Mar 07 06:59:40 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 07 06:59:40 crc restorecon[4689]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 07 06:59:40 crc restorecon[4689]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc 
restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc 
restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 
06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 
crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 
06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 06:59:40 crc 
restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc 
restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 
crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc 
restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:40 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 07 06:59:41 crc restorecon[4689]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc 
restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:59:41 crc restorecon[4689]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 07 06:59:41 crc restorecon[4689]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 07 06:59:42 crc kubenswrapper[4738]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 07 06:59:42 crc kubenswrapper[4738]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 07 06:59:42 crc kubenswrapper[4738]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 07 06:59:42 crc kubenswrapper[4738]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 07 06:59:42 crc kubenswrapper[4738]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 07 06:59:42 crc kubenswrapper[4738]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.135553 4738 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140591 4738 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140617 4738 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140624 4738 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140630 4738 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140636 4738 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140643 4738 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140649 4738 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140655 4738 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140661 4738 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140667 4738 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140673 4738 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140677 4738 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140682 4738 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140697 4738 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140702 4738 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140708 4738 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140714 4738 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140719 4738 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140723 4738 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140728 4738 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140744 4738 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140749 4738 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140754 4738 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140760 4738 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140766 4738 feature_gate.go:330] unrecognized feature gate: Example Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140771 4738 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140776 4738 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140780 4738 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140785 4738 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140789 4738 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140794 4738 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140798 4738 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140803 4738 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140809 4738 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140814 4738 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140819 4738 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140823 4738 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140827 4738 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140831 4738 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140835 4738 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140839 4738 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140843 4738 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140850 4738 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140855 4738 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140860 4738 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140864 4738 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140868 4738 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140872 4738 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140876 4738 
feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140881 4738 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140885 4738 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140890 4738 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140894 4738 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140898 4738 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140903 4738 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140907 4738 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140911 4738 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140916 4738 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140920 4738 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140925 4738 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140929 4738 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140933 4738 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140937 4738 feature_gate.go:330] 
unrecognized feature gate: MachineConfigNodes Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140941 4738 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140945 4738 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140949 4738 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140953 4738 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140959 4738 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140963 4738 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140968 4738 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.140972 4738 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141836 4738 flags.go:64] FLAG: --address="0.0.0.0" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141870 4738 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141884 4738 flags.go:64] FLAG: --anonymous-auth="true" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141891 4738 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141900 4738 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141906 4738 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141914 4738 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 07 
06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141922 4738 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141928 4738 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141933 4738 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141941 4738 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141947 4738 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141953 4738 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141958 4738 flags.go:64] FLAG: --cgroup-root="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141963 4738 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141969 4738 flags.go:64] FLAG: --client-ca-file="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141974 4738 flags.go:64] FLAG: --cloud-config="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141979 4738 flags.go:64] FLAG: --cloud-provider="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141984 4738 flags.go:64] FLAG: --cluster-dns="[]" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141991 4738 flags.go:64] FLAG: --cluster-domain="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.141996 4738 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142001 4738 flags.go:64] FLAG: --config-dir="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142005 4738 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142009 4738 flags.go:64] FLAG: 
--container-log-max-files="5" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142015 4738 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142031 4738 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142037 4738 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142043 4738 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142048 4738 flags.go:64] FLAG: --contention-profiling="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142053 4738 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142058 4738 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142063 4738 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142077 4738 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142090 4738 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142095 4738 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142101 4738 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142106 4738 flags.go:64] FLAG: --enable-load-reader="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142111 4738 flags.go:64] FLAG: --enable-server="true" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142116 4738 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142124 4738 flags.go:64] FLAG: --event-burst="100" Mar 07 06:59:42 crc kubenswrapper[4738]: 
I0307 06:59:42.142129 4738 flags.go:64] FLAG: --event-qps="50" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142134 4738 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142143 4738 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142149 4738 flags.go:64] FLAG: --eviction-hard="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142173 4738 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142179 4738 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142185 4738 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142190 4738 flags.go:64] FLAG: --eviction-soft="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142195 4738 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142200 4738 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142205 4738 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142211 4738 flags.go:64] FLAG: --experimental-mounter-path="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142216 4738 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142221 4738 flags.go:64] FLAG: --fail-swap-on="true" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142226 4738 flags.go:64] FLAG: --feature-gates="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142233 4738 flags.go:64] FLAG: --file-check-frequency="20s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142238 4738 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 07 06:59:42 crc 
kubenswrapper[4738]: I0307 06:59:42.142244 4738 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142249 4738 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142254 4738 flags.go:64] FLAG: --healthz-port="10248" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142258 4738 flags.go:64] FLAG: --help="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142262 4738 flags.go:64] FLAG: --hostname-override="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142266 4738 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142271 4738 flags.go:64] FLAG: --http-check-frequency="20s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142275 4738 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142279 4738 flags.go:64] FLAG: --image-credential-provider-config="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142283 4738 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142288 4738 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142293 4738 flags.go:64] FLAG: --image-service-endpoint="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142297 4738 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142301 4738 flags.go:64] FLAG: --kube-api-burst="100" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142305 4738 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142310 4738 flags.go:64] FLAG: --kube-api-qps="50" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142314 4738 flags.go:64] FLAG: --kube-reserved="" Mar 07 06:59:42 crc 
kubenswrapper[4738]: I0307 06:59:42.142319 4738 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142324 4738 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142328 4738 flags.go:64] FLAG: --kubelet-cgroups="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142333 4738 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142337 4738 flags.go:64] FLAG: --lock-file="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142341 4738 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142345 4738 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142349 4738 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142356 4738 flags.go:64] FLAG: --log-json-split-stream="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142361 4738 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142365 4738 flags.go:64] FLAG: --log-text-split-stream="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142369 4738 flags.go:64] FLAG: --logging-format="text" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142373 4738 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142378 4738 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142383 4738 flags.go:64] FLAG: --manifest-url="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142387 4738 flags.go:64] FLAG: --manifest-url-header="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142393 4738 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 07 
06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142398 4738 flags.go:64] FLAG: --max-open-files="1000000" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142404 4738 flags.go:64] FLAG: --max-pods="110" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142408 4738 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142413 4738 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142417 4738 flags.go:64] FLAG: --memory-manager-policy="None" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142421 4738 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142426 4738 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142430 4738 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142435 4738 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142447 4738 flags.go:64] FLAG: --node-status-max-images="50" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142452 4738 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142456 4738 flags.go:64] FLAG: --oom-score-adj="-999" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142460 4738 flags.go:64] FLAG: --pod-cidr="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142466 4738 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142475 4738 flags.go:64] FLAG: --pod-manifest-path="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 
06:59:42.142480 4738 flags.go:64] FLAG: --pod-max-pids="-1" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142485 4738 flags.go:64] FLAG: --pods-per-core="0" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142490 4738 flags.go:64] FLAG: --port="10250" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142494 4738 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142498 4738 flags.go:64] FLAG: --provider-id="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142503 4738 flags.go:64] FLAG: --qos-reserved="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142507 4738 flags.go:64] FLAG: --read-only-port="10255" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142511 4738 flags.go:64] FLAG: --register-node="true" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142516 4738 flags.go:64] FLAG: --register-schedulable="true" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142520 4738 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142528 4738 flags.go:64] FLAG: --registry-burst="10" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142532 4738 flags.go:64] FLAG: --registry-qps="5" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142536 4738 flags.go:64] FLAG: --reserved-cpus="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142540 4738 flags.go:64] FLAG: --reserved-memory="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142545 4738 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142550 4738 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142554 4738 flags.go:64] FLAG: --rotate-certificates="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142558 4738 flags.go:64] FLAG: --rotate-server-certificates="false" 
Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142563 4738 flags.go:64] FLAG: --runonce="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142567 4738 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142571 4738 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142575 4738 flags.go:64] FLAG: --seccomp-default="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142579 4738 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142583 4738 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142588 4738 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142593 4738 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142598 4738 flags.go:64] FLAG: --storage-driver-password="root" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142602 4738 flags.go:64] FLAG: --storage-driver-secure="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142606 4738 flags.go:64] FLAG: --storage-driver-table="stats" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142610 4738 flags.go:64] FLAG: --storage-driver-user="root" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142616 4738 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142620 4738 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142625 4738 flags.go:64] FLAG: --system-cgroups="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142629 4738 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142637 
4738 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142642 4738 flags.go:64] FLAG: --tls-cert-file="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142646 4738 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142651 4738 flags.go:64] FLAG: --tls-min-version="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142655 4738 flags.go:64] FLAG: --tls-private-key-file="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142659 4738 flags.go:64] FLAG: --topology-manager-policy="none" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142664 4738 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142668 4738 flags.go:64] FLAG: --topology-manager-scope="container" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142672 4738 flags.go:64] FLAG: --v="2" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142678 4738 flags.go:64] FLAG: --version="false" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142684 4738 flags.go:64] FLAG: --vmodule="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142690 4738 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.142694 4738 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142815 4738 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142820 4738 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142824 4738 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142828 4738 feature_gate.go:330] unrecognized feature gate: Example Mar 07 06:59:42 crc 
kubenswrapper[4738]: W0307 06:59:42.142833 4738 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142836 4738 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142840 4738 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142844 4738 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142847 4738 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142851 4738 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142854 4738 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142863 4738 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142867 4738 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142871 4738 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142874 4738 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142878 4738 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142882 4738 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142885 4738 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142890 4738 feature_gate.go:330] unrecognized feature 
gate: VSphereMultiVCenters Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142893 4738 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142897 4738 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142901 4738 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142904 4738 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142919 4738 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142923 4738 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142926 4738 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142930 4738 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142933 4738 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142937 4738 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142941 4738 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142944 4738 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142948 4738 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142952 4738 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 07 
06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142955 4738 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142960 4738 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142965 4738 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142969 4738 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142972 4738 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142976 4738 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142979 4738 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142983 4738 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142986 4738 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142990 4738 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142996 4738 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.142999 4738 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143003 4738 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143006 4738 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 07 06:59:42 crc 
kubenswrapper[4738]: W0307 06:59:42.143009 4738 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143013 4738 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143017 4738 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143026 4738 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143030 4738 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143035 4738 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143039 4738 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143043 4738 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143047 4738 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143050 4738 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143054 4738 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143057 4738 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143061 4738 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143065 4738 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 
07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143069 4738 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143072 4738 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143076 4738 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143079 4738 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143083 4738 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143088 4738 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143093 4738 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143097 4738 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143101 4738 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.143106 4738 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.143119 4738 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.152674 4738 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.152730 4738 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152831 4738 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152852 4738 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152859 4738 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152865 4738 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152871 4738 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152877 4738 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152883 4738 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152888 4738 feature_gate.go:330] 
unrecognized feature gate: AzureWorkloadIdentity Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152893 4738 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152899 4738 feature_gate.go:330] unrecognized feature gate: Example Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152904 4738 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152909 4738 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152914 4738 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152919 4738 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152926 4738 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152932 4738 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152937 4738 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152942 4738 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152948 4738 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152953 4738 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152958 4738 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152964 4738 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 07 06:59:42 crc kubenswrapper[4738]: 
W0307 06:59:42.152970 4738 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152976 4738 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152981 4738 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152988 4738 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.152998 4738 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153005 4738 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153010 4738 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153015 4738 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153021 4738 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153026 4738 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153032 4738 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153039 4738 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153045 4738 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153075 4738 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153082 4738 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153087 4738 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153092 4738 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153097 4738 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153103 4738 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153108 4738 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153113 4738 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153118 4738 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153123 4738 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153128 4738 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153133 4738 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153139 4738 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 07 06:59:42 crc kubenswrapper[4738]: 
W0307 06:59:42.153144 4738 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153150 4738 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153357 4738 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153367 4738 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153372 4738 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153377 4738 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153383 4738 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153390 4738 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153396 4738 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153401 4738 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153405 4738 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153410 4738 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153414 4738 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153420 4738 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
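The same `unrecognized feature gate` warnings recur throughout this capture because the kubelet evaluates the gate configuration more than once during startup (a `feature_gate.go:386` summary line follows each pass). When triaging a capture like this, a short script can collapse the noise into per-gate counts. A minimal sketch; the sample text below is illustrative, and it assumes the journal output has been saved as plain text:

```python
import re
from collections import Counter

def unrecognized_gates(journal_text: str) -> Counter:
    """Count how often each gate appears in `unrecognized feature gate: X` warnings."""
    return Counter(re.findall(r"unrecognized feature gate: (\S+)", journal_text))

# Abbreviated sample in the same shape as the journal lines above.
sample = (
    "W0307 06:59:42.152831 4738 feature_gate.go:330] unrecognized feature gate: UpgradeStatus\n"
    "W0307 06:59:42.153951 4738 feature_gate.go:330] unrecognized feature gate: UpgradeStatus\n"
    "W0307 06:59:42.152852 4738 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags\n"
)
counts = unrecognized_gates(sample)
print(counts["UpgradeStatus"])  # 2
```

Running this over the full journal makes it obvious which gates are merely repeated per-pass noise versus genuinely misconfigured names.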
Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153427 4738 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153432 4738 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153437 4738 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153441 4738 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153446 4738 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153451 4738 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153456 4738 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153461 4738 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153466 4738 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.153476 4738 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153641 4738 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153654 4738 
feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153660 4738 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153667 4738 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153674 4738 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153680 4738 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153686 4738 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153691 4738 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153696 4738 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153702 4738 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153708 4738 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153713 4738 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153719 4738 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153724 4738 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153731 4738 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153738 4738 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153744 4738 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153750 4738 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153755 4738 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153761 4738 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153766 4738 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153771 4738 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153776 4738 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 
06:59:42.153780 4738 feature_gate.go:330] unrecognized feature gate: Example Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153784 4738 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153789 4738 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153794 4738 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153799 4738 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153805 4738 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153809 4738 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153814 4738 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153819 4738 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153823 4738 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153828 4738 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153833 4738 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153837 4738 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153843 4738 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153848 4738 feature_gate.go:330] unrecognized feature gate: 
ManagedBootImages Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153852 4738 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153857 4738 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153862 4738 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153867 4738 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153872 4738 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153876 4738 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153881 4738 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153886 4738 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153892 4738 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153897 4738 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153902 4738 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153907 4738 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153913 4738 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153917 4738 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 
06:59:42.153922 4738 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153926 4738 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153931 4738 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153936 4738 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153940 4738 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153946 4738 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153951 4738 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153956 4738 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153961 4738 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153966 4738 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153971 4738 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153975 4738 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153980 4738 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153984 4738 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153991 4738 feature_gate.go:353] Setting GA feature gate 
CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.153997 4738 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.154003 4738 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.154009 4738 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.154014 4738 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.154024 4738 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.154290 4738 server.go:940] "Client rotation is on, will bootstrap in background" Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.158979 4738 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.162530 4738 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.162662 4738 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
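Each `feature_gate.go:386` line above prints the resolved gate map in Go's `map[Key:value ...]` notation. To diff these maps across restarts or between nodes, it can help to parse them into a structured form. A minimal Python sketch (the sample line is abbreviated from the log above; the helper name is hypothetical):

```python
import re

def parse_feature_gates(line: str) -> dict[str, bool]:
    """Turn the Go map printed by feature_gate.go:386 into a Python dict.

    The kubelet logs the resolved gates as `feature gates: {map[Name:true ...]}`;
    this extracts each Name:bool pair.
    """
    m = re.search(r"feature gates: \{map\[(.*?)\]\}", line)
    if not m:
        return {}
    return {
        name: value == "true"
        for name, value in (pair.split(":") for pair in m.group(1).split())
    }

line = ('I0307 06:59:42.154024 4738 feature_gate.go:386] feature gates: '
        '{map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true '
        'KMSv1:true NodeSwap:false ValidatingAdmissionPolicy:true]}')
gates = parse_feature_gates(line)
print(gates["CloudDualStackNodeIPs"])  # True
```

Comparing the parsed dicts from the three summary lines in this capture confirms all passes resolved to the identical gate set.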
Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.164714 4738 server.go:997] "Starting client certificate rotation" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.164742 4738 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.164920 4738 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.193042 4738 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.197076 4738 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.199318 4738 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.220508 4738 log.go:25] "Validated CRI v1 runtime API" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.259991 4738 log.go:25] "Validated CRI v1 image API" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.262311 4738 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.267794 4738 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-07-06-55-48-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.267838 4738 fs.go:134] 
Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.285732 4738 manager.go:217] Machine: {Timestamp:2026-03-07 06:59:42.281433656 +0000 UTC m=+0.746420987 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:38a340c1-d2f7-4b78-944f-4f5f0a1624aa BootID:4314a625-f101-41ec-bd1c-f79c10f7f811 Filesystems:[{Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex 
MacAddress:fa:16:3e:c8:75:17 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:c8:75:17 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a3:b8:dd Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:6f:31:07 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:c0:5e:83 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:c1:bd:83 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:d2:0a:d4:d7:95:be Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:32:22:48:63:ef:26 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 
Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.286081 4738 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.286328 4738 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.287664 4738 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.287862 4738 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.287903 4738 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSR
eserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.290489 4738 topology_manager.go:138] "Creating topology manager with none policy" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.290516 4738 container_manager_linux.go:303] "Creating device plugin manager" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.290967 4738 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.290997 4738 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.291584 4738 state_mem.go:36] "Initialized new in-memory state store" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.291678 4738 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.297503 4738 kubelet.go:418] "Attempting to sync node with API server" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.297533 4738 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.297551 4738 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.297566 4738 kubelet.go:324] "Adding apiserver pod source" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.297579 4738 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 
06:59:42.305078 4738 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.307242 4738 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.309652 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.309766 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.309897 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.309936 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.309930 4738 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 07 06:59:42 crc 
kubenswrapper[4738]: I0307 06:59:42.312436 4738 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.312555 4738 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.312581 4738 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.312604 4738 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.312638 4738 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.312659 4738 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.312679 4738 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.312714 4738 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.312741 4738 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.312764 4738 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.312798 4738 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.312820 4738 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.316319 4738 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.317607 4738 server.go:1280] "Started kubelet" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 
06:59:42.317853 4738 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.317891 4738 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.318594 4738 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 07 06:59:42 crc systemd[1]: Started Kubernetes Kubelet. Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.320906 4738 server.go:460] "Adding debug handlers to kubelet server" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.321022 4738 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.321020 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.321085 4738 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.321150 4738 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.321187 4738 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.321229 4738 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.322044 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.322231 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="200ms" Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.322653 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.322747 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.323368 4738 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.323404 4738 factory.go:55] Registering systemd factory Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.323415 4738 factory.go:221] Registration of the systemd container factory successfully Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.323808 4738 factory.go:153] Registering CRI-O factory Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.323826 4738 factory.go:221] Registration of the crio container factory successfully Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.323856 4738 factory.go:103] Registering Raw factory Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.323878 4738 manager.go:1196] Started watching for new ooms in manager Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 
06:59:42.324656 4738 manager.go:319] Starting recovery of all containers Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332637 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332698 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332723 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332738 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332758 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332772 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" 
seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332788 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332807 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332825 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332845 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332857 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332874 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 
06:59:42.332889 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332909 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332921 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332939 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332952 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332966 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.332983 4738 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.333382 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.333402 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334647 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334732 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334750 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334784 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334796 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334824 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334847 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334859 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334870 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334884 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334902 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334916 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334928 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334941 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334960 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334971 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334981 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.334995 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335011 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335028 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335039 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335050 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335065 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335078 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335095 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335107 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335120 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335135 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335147 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335189 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335202 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335227 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335246 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335268 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" 
volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335287 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335301 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335318 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335330 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335345 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335359 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" 
seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335376 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335390 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335747 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335763 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335778 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335794 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335815 
4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335826 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335837 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335854 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335868 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335893 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335910 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335927 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335946 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335961 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335978 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.335994 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336006 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336024 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336041 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336060 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336076 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336090 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336111 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" 
seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336124 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336146 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336255 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336286 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336307 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336323 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: 
I0307 06:59:42.336341 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336365 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336383 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336407 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336424 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336442 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336472 4738 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336493 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336514 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336531 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336549 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336575 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336614 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336638 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336667 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.336692 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337062 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337099 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337117 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337132 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337148 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337178 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337197 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337211 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337228 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337240 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337252 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337269 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337280 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337295 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337307 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" 
seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337320 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337336 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337348 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337365 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337380 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337391 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337407 4738 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337420 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337435 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337448 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337461 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337478 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337496 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337511 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337529 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337540 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337555 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337566 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337576 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" 
volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337589 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337600 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337615 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.333897 4738 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.51:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189a7cf20a205e2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,LastTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:59:42 crc 
kubenswrapper[4738]: I0307 06:59:42.337626 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337680 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.337699 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.339893 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.339930 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.339962 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.339992 4738 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340020 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340047 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340076 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340104 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340130 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340152 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340250 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340278 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340329 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340367 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340399 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340432 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" 
volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340456 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340479 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340503 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340526 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340548 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340572 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340595 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340618 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340641 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340665 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340687 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340712 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340737 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340759 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340782 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340808 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340849 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340891 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" 
seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340924 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340958 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.340988 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.341012 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.341035 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.341058 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 
06:59:42.341082 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.341106 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.341191 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.341248 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.341278 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.345993 4738 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 07 06:59:42 
crc kubenswrapper[4738]: I0307 06:59:42.346051 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.346081 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.346106 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.346132 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.346191 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.346216 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.346236 4738 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.346261 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.346288 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.346315 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.346343 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.346379 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.346414 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.346448 4738 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.346480 4738 reconstruct.go:97] "Volume reconstruction finished" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.346503 4738 reconciler.go:26] "Reconciler: start to sync state" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.350902 4738 manager.go:324] Recovery completed Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.363668 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.365853 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.365915 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.365924 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.366989 4738 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.367254 4738 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.367383 4738 state_mem.go:36] "Initialized new in-memory state store" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.382266 4738 kubelet_network_linux.go:50] "Initialized iptables 
rules." protocol="IPv4" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.384286 4738 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.384359 4738 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.384398 4738 kubelet.go:2335] "Starting kubelet main sync loop" Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.384571 4738 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.385263 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.385353 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.389624 4738 policy_none.go:49] "None policy: Start" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.391143 4738 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.391194 4738 state_mem.go:35] "Initializing new in-memory state store" Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.422623 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.473264 4738 manager.go:334] 
"Starting Device Plugin manager" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.473342 4738 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.473360 4738 server.go:79] "Starting device plugin registration server" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.474011 4738 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.474028 4738 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.474487 4738 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.474584 4738 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.474593 4738 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.485246 4738 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.485354 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.485675 4738 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.486919 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.486955 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.486967 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.487097 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.487470 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.487533 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.487949 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.487992 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.488026 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.488380 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.488618 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.488699 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.489690 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.489729 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.489741 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.489983 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.490009 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.490036 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.489992 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.490095 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.490108 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.490326 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.490500 
4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.490551 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.491629 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.491630 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.491684 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.491698 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.491656 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.491745 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.491856 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.491960 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.492004 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.492814 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.492843 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.492857 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.493294 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.493376 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.493304 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.493449 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.493465 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.494835 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.494877 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.494895 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.523806 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="400ms" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.549408 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.549469 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.549504 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.549530 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.549558 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.549579 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.549706 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.549774 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.549803 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 
06:59:42.549824 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.549854 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.549878 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.549920 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.549971 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.549991 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.575740 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.577934 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.578341 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.578455 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.578614 4738 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.579627 4738 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.51:6443: connect: connection refused" node="crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.651784 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.651854 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.651881 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.651903 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.651924 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.651949 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.651973 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 
06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.651996 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652023 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652045 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652065 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652090 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652110 4738 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652133 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652182 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652232 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652269 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652362 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652395 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652412 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652287 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652366 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652232 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652372 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652380 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652357 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652241 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652292 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652344 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: 
\"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.652529 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.780276 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.781754 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.781793 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.781802 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.781826 4738 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.782384 4738 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.51:6443: connect: connection refused" node="crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.821215 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.830610 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.854192 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.880362 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.885294 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-b4227387aca0c5fd199be68b3f0d719c2ceac2dee002d4c5b5fc78f225ccb6e8 WatchSource:0}: Error finding container b4227387aca0c5fd199be68b3f0d719c2ceac2dee002d4c5b5fc78f225ccb6e8: Status 404 returned error can't find the container with id b4227387aca0c5fd199be68b3f0d719c2ceac2dee002d4c5b5fc78f225ccb6e8 Mar 07 06:59:42 crc kubenswrapper[4738]: I0307 06:59:42.885994 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.902889 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-a4ef603a0893655b1073182b185d2e1c04249c57cd452cbe5ca0c8bf177b7cd6 WatchSource:0}: Error finding container a4ef603a0893655b1073182b185d2e1c04249c57cd452cbe5ca0c8bf177b7cd6: Status 404 returned error can't find the container with id a4ef603a0893655b1073182b185d2e1c04249c57cd452cbe5ca0c8bf177b7cd6 Mar 07 06:59:42 crc kubenswrapper[4738]: W0307 06:59:42.903814 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-39086ef90cc2caadbdec11c458c7bec76873dda833b29bbb614902fdcfa4b988 WatchSource:0}: Error finding container 39086ef90cc2caadbdec11c458c7bec76873dda833b29bbb614902fdcfa4b988: Status 404 returned error can't find 
the container with id 39086ef90cc2caadbdec11c458c7bec76873dda833b29bbb614902fdcfa4b988 Mar 07 06:59:42 crc kubenswrapper[4738]: E0307 06:59:42.925665 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="800ms" Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.182699 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.184646 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.184708 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.184722 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.184762 4738 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:59:43 crc kubenswrapper[4738]: E0307 06:59:43.185360 4738 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.51:6443: connect: connection refused" node="crc" Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.322913 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:43 crc kubenswrapper[4738]: W0307 06:59:43.378600 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:43 crc kubenswrapper[4738]: E0307 06:59:43.378721 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.391718 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"af5f849caea3635510546d7b599542cc2d2e2d92f080e246c054913b932996da"} Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.393757 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"a4ef603a0893655b1073182b185d2e1c04249c57cd452cbe5ca0c8bf177b7cd6"} Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.395080 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"39086ef90cc2caadbdec11c458c7bec76873dda833b29bbb614902fdcfa4b988"} Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.396686 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b4227387aca0c5fd199be68b3f0d719c2ceac2dee002d4c5b5fc78f225ccb6e8"} Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.398271 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2f7485bbdc5c71a7f04c5e7912ea1589f57af3346485b7948395449d46c403f4"} Mar 07 06:59:43 crc kubenswrapper[4738]: W0307 06:59:43.553108 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:43 crc kubenswrapper[4738]: E0307 06:59:43.553718 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:59:43 crc kubenswrapper[4738]: W0307 06:59:43.622931 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:43 crc kubenswrapper[4738]: E0307 06:59:43.623075 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:59:43 crc kubenswrapper[4738]: E0307 06:59:43.726522 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="1.6s" Mar 07 06:59:43 crc 
kubenswrapper[4738]: W0307 06:59:43.774812 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:43 crc kubenswrapper[4738]: E0307 06:59:43.774947 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.986017 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.988055 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.988424 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.988469 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:43 crc kubenswrapper[4738]: I0307 06:59:43.988566 4738 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:59:43 crc kubenswrapper[4738]: E0307 06:59:43.990084 4738 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.51:6443: connect: connection refused" node="crc" Mar 07 06:59:44 crc kubenswrapper[4738]: E0307 06:59:44.081372 4738 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.51:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189a7cf20a205e2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,LastTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.299824 4738 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 06:59:44 crc kubenswrapper[4738]: E0307 06:59:44.301579 4738 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.322647 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.404915 4738 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f" exitCode=0 Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.405104 4738 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f"} Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.405138 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.406277 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.406331 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.406347 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.407680 4738 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8" exitCode=0 Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.407792 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8"} Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.407909 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.411585 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.411641 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 
06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.411659 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.414753 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"caf48b03181b9be68af545dd350d90b780fb7693c67532b7849de1b07b3f65b0"} Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.414854 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a0403a34c44fc04e213d8734bc09b1339c16fed303ce66be6c0c4d1a17794c04"} Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.414884 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"279f12169f448af350bf9dc33c1af84d8f6f9ca8c029f85cd091109043448c92"} Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.414807 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.414907 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985"} Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.416093 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.416126 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.416143 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.418872 4738 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699" exitCode=0 Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.419037 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699"} Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.419142 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.420929 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.420987 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.421001 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.423376 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.431665 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.431732 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:44 crc kubenswrapper[4738]: 
I0307 06:59:44.431754 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.431806 4738 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081" exitCode=0 Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.431909 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081"} Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.432141 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.436679 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.436725 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.436742 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:44 crc kubenswrapper[4738]: I0307 06:59:44.973624 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.322781 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:45 crc kubenswrapper[4738]: E0307 06:59:45.327736 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="3.2s" Mar 07 06:59:45 crc kubenswrapper[4738]: W0307 06:59:45.361790 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:45 crc kubenswrapper[4738]: E0307 06:59:45.361922 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.436968 4738 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f" exitCode=0 Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.437048 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f"} Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.437083 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.438082 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.438135 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:45 crc 
kubenswrapper[4738]: I0307 06:59:45.438149 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.442824 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"697506c1286108c2dedc27fda25723fdcf24bb4f78ffc7a57fbf8c7e291f3522"} Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.442923 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.449012 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.449052 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.449064 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.451243 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9"} Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.451340 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7"} Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.451358 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823"} Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.451446 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.452681 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.452707 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.452719 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.457710 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb"} Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.457751 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9"} Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.457766 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf"} Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.457778 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114"} Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.457914 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.459874 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.459923 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.459938 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.590417 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.591894 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.591945 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.591955 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:45 crc kubenswrapper[4738]: I0307 06:59:45.591991 4738 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:59:45 crc kubenswrapper[4738]: E0307 06:59:45.592736 4738 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.51:6443: connect: connection refused" node="crc" Mar 07 06:59:45 crc kubenswrapper[4738]: W0307 06:59:45.614541 4738 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:45 crc kubenswrapper[4738]: E0307 06:59:45.614637 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:59:45 crc kubenswrapper[4738]: W0307 06:59:45.704079 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:45 crc kubenswrapper[4738]: E0307 06:59:45.704182 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:59:45 crc kubenswrapper[4738]: W0307 06:59:45.865113 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 06:59:45 crc kubenswrapper[4738]: E0307 06:59:45.865274 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.463944 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9bde46963e160670cd46ca2613aa4614de065a0e1c97c57c55b4b5b4119e40e3"} Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.464104 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.465796 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.465836 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.465854 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.466881 4738 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6" exitCode=0 Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.466975 4738 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.467006 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.467519 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.467905 4738 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6"} Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.468002 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.468336 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.468803 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.468836 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.468850 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.468851 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.468904 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.468909 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.468925 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.468942 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.468960 4738 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.469206 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.469238 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:46 crc kubenswrapper[4738]: I0307 06:59:46.469253 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:47 crc kubenswrapper[4738]: I0307 06:59:47.442626 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:59:47 crc kubenswrapper[4738]: I0307 06:59:47.474605 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6"} Mar 07 06:59:47 crc kubenswrapper[4738]: I0307 06:59:47.474657 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:47 crc kubenswrapper[4738]: I0307 06:59:47.474674 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee"} Mar 07 06:59:47 crc kubenswrapper[4738]: I0307 06:59:47.474704 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e"} Mar 07 06:59:47 crc kubenswrapper[4738]: I0307 06:59:47.474723 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca"} Mar 07 06:59:47 crc kubenswrapper[4738]: I0307 06:59:47.474788 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:59:47 crc kubenswrapper[4738]: I0307 06:59:47.475966 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:47 crc kubenswrapper[4738]: I0307 06:59:47.476013 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:47 crc kubenswrapper[4738]: I0307 06:59:47.476025 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:47 crc kubenswrapper[4738]: I0307 06:59:47.525953 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:59:47 crc kubenswrapper[4738]: I0307 06:59:47.526194 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:47 crc kubenswrapper[4738]: I0307 06:59:47.527655 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:47 crc kubenswrapper[4738]: I0307 06:59:47.527708 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:47 crc kubenswrapper[4738]: I0307 06:59:47.527721 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 06:59:48.130998 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 
06:59:48.485136 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c"} Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 06:59:48.485328 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 06:59:48.485354 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 06:59:48.487225 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 06:59:48.487286 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 06:59:48.487307 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 06:59:48.487301 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 06:59:48.487378 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 06:59:48.487413 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 06:59:48.625408 4738 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 06:59:48.792879 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 
06:59:48.795291 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 06:59:48.795342 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 06:59:48.795355 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:48 crc kubenswrapper[4738]: I0307 06:59:48.795397 4738 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 06:59:49 crc kubenswrapper[4738]: I0307 06:59:49.188393 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 07 06:59:49 crc kubenswrapper[4738]: I0307 06:59:49.488842 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:49 crc kubenswrapper[4738]: I0307 06:59:49.488859 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:49 crc kubenswrapper[4738]: I0307 06:59:49.490382 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:49 crc kubenswrapper[4738]: I0307 06:59:49.490504 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:49 crc kubenswrapper[4738]: I0307 06:59:49.490543 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:49 crc kubenswrapper[4738]: I0307 06:59:49.490382 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:49 crc kubenswrapper[4738]: I0307 06:59:49.490576 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:49 crc kubenswrapper[4738]: 
I0307 06:59:49.490592 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:50 crc kubenswrapper[4738]: I0307 06:59:50.265246 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 07 06:59:50 crc kubenswrapper[4738]: I0307 06:59:50.265490 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:50 crc kubenswrapper[4738]: I0307 06:59:50.267256 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:50 crc kubenswrapper[4738]: I0307 06:59:50.267353 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:50 crc kubenswrapper[4738]: I0307 06:59:50.267377 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:50 crc kubenswrapper[4738]: I0307 06:59:50.491877 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:50 crc kubenswrapper[4738]: I0307 06:59:50.493285 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:50 crc kubenswrapper[4738]: I0307 06:59:50.493350 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:50 crc kubenswrapper[4738]: I0307 06:59:50.493363 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:51 crc kubenswrapper[4738]: I0307 06:59:51.957330 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:59:51 crc kubenswrapper[4738]: I0307 06:59:51.957661 4738 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Mar 07 06:59:51 crc kubenswrapper[4738]: I0307 06:59:51.959404 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:51 crc kubenswrapper[4738]: I0307 06:59:51.959459 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:51 crc kubenswrapper[4738]: I0307 06:59:51.959469 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:51 crc kubenswrapper[4738]: I0307 06:59:51.965446 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:59:52 crc kubenswrapper[4738]: E0307 06:59:52.486361 4738 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 06:59:52 crc kubenswrapper[4738]: I0307 06:59:52.496886 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:52 crc kubenswrapper[4738]: I0307 06:59:52.497766 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:52 crc kubenswrapper[4738]: I0307 06:59:52.497835 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:52 crc kubenswrapper[4738]: I0307 06:59:52.497850 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:53 crc kubenswrapper[4738]: I0307 06:59:53.404683 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:59:53 crc kubenswrapper[4738]: I0307 06:59:53.500008 4738 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 07 06:59:53 crc kubenswrapper[4738]: I0307 06:59:53.501291 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:53 crc kubenswrapper[4738]: I0307 06:59:53.501406 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:53 crc kubenswrapper[4738]: I0307 06:59:53.501426 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:53 crc kubenswrapper[4738]: I0307 06:59:53.505592 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 06:59:54 crc kubenswrapper[4738]: I0307 06:59:54.503480 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:54 crc kubenswrapper[4738]: I0307 06:59:54.504883 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:54 crc kubenswrapper[4738]: I0307 06:59:54.504936 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:54 crc kubenswrapper[4738]: I0307 06:59:54.504955 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:54 crc kubenswrapper[4738]: I0307 06:59:54.990870 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 07 06:59:54 crc kubenswrapper[4738]: I0307 06:59:54.991141 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:54 crc kubenswrapper[4738]: I0307 06:59:54.992655 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:54 crc kubenswrapper[4738]: I0307 
06:59:54.992692 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:54 crc kubenswrapper[4738]: I0307 06:59:54.992704 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:56 crc kubenswrapper[4738]: W0307 06:59:56.250659 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:56Z is after 2026-02-23T05:33:13Z Mar 07 06:59:56 crc kubenswrapper[4738]: E0307 06:59:56.250800 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 06:59:56 crc kubenswrapper[4738]: W0307 06:59:56.251230 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:56Z is after 2026-02-23T05:33:13Z Mar 07 06:59:56 crc kubenswrapper[4738]: E0307 06:59:56.251334 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 06:59:56 crc kubenswrapper[4738]: W0307 06:59:56.252798 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:56Z is after 2026-02-23T05:33:13Z Mar 07 06:59:56 crc kubenswrapper[4738]: E0307 06:59:56.252898 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.254244 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:56Z is after 2026-02-23T05:33:13Z Mar 07 06:59:56 crc kubenswrapper[4738]: W0307 06:59:56.254859 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:56Z is after 2026-02-23T05:33:13Z Mar 07 06:59:56 crc kubenswrapper[4738]: E0307 06:59:56.254932 4738 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 06:59:56 crc kubenswrapper[4738]: E0307 06:59:56.259309 4738 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 06:59:56 crc kubenswrapper[4738]: E0307 06:59:56.261018 4738 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:56Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189a7cf20a205e2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,LastTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 06:59:56 crc kubenswrapper[4738]: E0307 06:59:56.262066 4738 controller.go:145] "Failed to ensure 
lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:56Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.263814 4738 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.263872 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 07 06:59:56 crc kubenswrapper[4738]: E0307 06:59:56.264849 4738 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:56Z is after 2026-02-23T05:33:13Z" node="crc" Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.268668 4738 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.268727 4738 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.325124 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:56Z is after 2026-02-23T05:33:13Z Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.405763 4738 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.405897 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.511277 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.512937 4738 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9bde46963e160670cd46ca2613aa4614de065a0e1c97c57c55b4b5b4119e40e3" 
exitCode=255 Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.512996 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9bde46963e160670cd46ca2613aa4614de065a0e1c97c57c55b4b5b4119e40e3"} Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.513183 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.514071 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.514148 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.514197 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:56 crc kubenswrapper[4738]: I0307 06:59:56.514978 4738 scope.go:117] "RemoveContainer" containerID="9bde46963e160670cd46ca2613aa4614de065a0e1c97c57c55b4b5b4119e40e3" Mar 07 06:59:57 crc kubenswrapper[4738]: I0307 06:59:57.100334 4738 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 06:59:57 crc kubenswrapper[4738]: I0307 06:59:57.327104 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:57Z is after 2026-02-23T05:33:13Z Mar 07 06:59:57 crc kubenswrapper[4738]: I0307 06:59:57.448494 4738 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP 
probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]log ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]etcd ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/generic-apiserver-start-informers ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/priority-and-fairness-filter ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/start-apiextensions-informers ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/start-apiextensions-controllers ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/crd-informer-synced ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/start-system-namespaces-controller ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 07 06:59:57 crc kubenswrapper[4738]: 
[+]poststarthook/start-service-ip-repair-controllers ok Mar 07 06:59:57 crc kubenswrapper[4738]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/bootstrap-controller ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/start-kube-aggregator-informers ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/apiservice-registration-controller ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/apiservice-discovery-controller ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]autoregister-completion ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/apiservice-openapi-controller ok Mar 07 06:59:57 crc kubenswrapper[4738]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 07 06:59:57 crc kubenswrapper[4738]: livez check failed Mar 07 06:59:57 crc kubenswrapper[4738]: I0307 06:59:57.448569 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 06:59:57 crc kubenswrapper[4738]: I0307 06:59:57.518278 4738 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 07 06:59:57 crc kubenswrapper[4738]: I0307 06:59:57.519084 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 07 06:59:57 crc kubenswrapper[4738]: I0307 06:59:57.521828 4738 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9e273ba5660d1326a78a9a4cfb5dce2571c6edaa5bac3e33d0e555755e0fbb4d" exitCode=255 Mar 07 06:59:57 crc kubenswrapper[4738]: I0307 06:59:57.521886 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9e273ba5660d1326a78a9a4cfb5dce2571c6edaa5bac3e33d0e555755e0fbb4d"} Mar 07 06:59:57 crc kubenswrapper[4738]: I0307 06:59:57.521968 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:57 crc kubenswrapper[4738]: I0307 06:59:57.522092 4738 scope.go:117] "RemoveContainer" containerID="9bde46963e160670cd46ca2613aa4614de065a0e1c97c57c55b4b5b4119e40e3" Mar 07 06:59:57 crc kubenswrapper[4738]: I0307 06:59:57.522999 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:57 crc kubenswrapper[4738]: I0307 06:59:57.523039 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:57 crc kubenswrapper[4738]: I0307 06:59:57.523050 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:57 crc kubenswrapper[4738]: I0307 06:59:57.523603 4738 scope.go:117] "RemoveContainer" containerID="9e273ba5660d1326a78a9a4cfb5dce2571c6edaa5bac3e33d0e555755e0fbb4d" Mar 07 06:59:57 
crc kubenswrapper[4738]: E0307 06:59:57.523833 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:59:58 crc kubenswrapper[4738]: I0307 06:59:58.326461 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:58Z is after 2026-02-23T05:33:13Z Mar 07 06:59:58 crc kubenswrapper[4738]: I0307 06:59:58.527074 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 07 06:59:58 crc kubenswrapper[4738]: I0307 06:59:58.530498 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 06:59:58 crc kubenswrapper[4738]: I0307 06:59:58.531446 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 06:59:58 crc kubenswrapper[4738]: I0307 06:59:58.531482 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 06:59:58 crc kubenswrapper[4738]: I0307 06:59:58.531493 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 06:59:58 crc kubenswrapper[4738]: I0307 06:59:58.532110 4738 scope.go:117] "RemoveContainer" containerID="9e273ba5660d1326a78a9a4cfb5dce2571c6edaa5bac3e33d0e555755e0fbb4d" Mar 07 06:59:58 crc kubenswrapper[4738]: E0307 
06:59:58.532328 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 06:59:59 crc kubenswrapper[4738]: I0307 06:59:59.324604 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T06:59:59Z is after 2026-02-23T05:33:13Z Mar 07 07:00:00 crc kubenswrapper[4738]: I0307 07:00:00.324321 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:00Z is after 2026-02-23T05:33:13Z Mar 07 07:00:01 crc kubenswrapper[4738]: I0307 07:00:01.324599 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:01Z is after 2026-02-23T05:33:13Z Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.325124 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:02Z is after 
2026-02-23T05:33:13Z Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.450066 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.450316 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.452676 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.452736 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.452752 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.453558 4738 scope.go:117] "RemoveContainer" containerID="9e273ba5660d1326a78a9a4cfb5dce2571c6edaa5bac3e33d0e555755e0fbb4d" Mar 07 07:00:02 crc kubenswrapper[4738]: E0307 07:00:02.453795 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.456054 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:00:02 crc kubenswrapper[4738]: E0307 07:00:02.486925 4738 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.543047 4738 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.544270 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.544304 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.544315 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.544929 4738 scope.go:117] "RemoveContainer" containerID="9e273ba5660d1326a78a9a4cfb5dce2571c6edaa5bac3e33d0e555755e0fbb4d" Mar 07 07:00:02 crc kubenswrapper[4738]: E0307 07:00:02.545148 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.665068 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.666606 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.666647 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 07:00:02.666662 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:02 crc kubenswrapper[4738]: I0307 
07:00:02.666697 4738 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 07:00:02 crc kubenswrapper[4738]: E0307 07:00:02.666782 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:02Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 07 07:00:02 crc kubenswrapper[4738]: E0307 07:00:02.669499 4738 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:02Z is after 2026-02-23T05:33:13Z" node="crc" Mar 07 07:00:03 crc kubenswrapper[4738]: I0307 07:00:03.325542 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:03Z is after 2026-02-23T05:33:13Z Mar 07 07:00:04 crc kubenswrapper[4738]: I0307 07:00:04.268918 4738 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 07:00:04 crc kubenswrapper[4738]: E0307 07:00:04.272647 4738 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 07:00:04 crc 
kubenswrapper[4738]: I0307 07:00:04.300317 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:00:04 crc kubenswrapper[4738]: I0307 07:00:04.300667 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:04 crc kubenswrapper[4738]: I0307 07:00:04.302273 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:04 crc kubenswrapper[4738]: I0307 07:00:04.302331 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:04 crc kubenswrapper[4738]: I0307 07:00:04.302347 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:04 crc kubenswrapper[4738]: I0307 07:00:04.303080 4738 scope.go:117] "RemoveContainer" containerID="9e273ba5660d1326a78a9a4cfb5dce2571c6edaa5bac3e33d0e555755e0fbb4d" Mar 07 07:00:04 crc kubenswrapper[4738]: E0307 07:00:04.303289 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 07:00:04 crc kubenswrapper[4738]: I0307 07:00:04.325354 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:04Z is after 2026-02-23T05:33:13Z Mar 07 07:00:04 crc kubenswrapper[4738]: W0307 07:00:04.682734 4738 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:04Z is after 2026-02-23T05:33:13Z Mar 07 07:00:04 crc kubenswrapper[4738]: E0307 07:00:04.682833 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 07:00:04 crc kubenswrapper[4738]: W0307 07:00:04.747900 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:04Z is after 2026-02-23T05:33:13Z Mar 07 07:00:04 crc kubenswrapper[4738]: E0307 07:00:04.747990 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 07:00:05 crc kubenswrapper[4738]: I0307 07:00:05.015551 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 07 07:00:05 crc kubenswrapper[4738]: I0307 07:00:05.015916 4738 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:05 crc kubenswrapper[4738]: I0307 07:00:05.017999 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:05 crc kubenswrapper[4738]: I0307 07:00:05.018058 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:05 crc kubenswrapper[4738]: I0307 07:00:05.018070 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:05 crc kubenswrapper[4738]: I0307 07:00:05.027303 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 07 07:00:05 crc kubenswrapper[4738]: I0307 07:00:05.324726 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:05Z is after 2026-02-23T05:33:13Z Mar 07 07:00:05 crc kubenswrapper[4738]: I0307 07:00:05.550108 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:05 crc kubenswrapper[4738]: I0307 07:00:05.551044 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:05 crc kubenswrapper[4738]: I0307 07:00:05.551077 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:05 crc kubenswrapper[4738]: I0307 07:00:05.551087 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:06 crc kubenswrapper[4738]: E0307 07:00:06.266004 4738 event.go:368] "Unable to write event (may retry after sleeping)" 
err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:06Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189a7cf20a205e2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,LastTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:06 crc kubenswrapper[4738]: I0307 07:00:06.326402 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:06Z is after 2026-02-23T05:33:13Z Mar 07 07:00:06 crc kubenswrapper[4738]: I0307 07:00:06.404726 4738 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 07:00:06 crc kubenswrapper[4738]: I0307 07:00:06.404853 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 07:00:06 crc kubenswrapper[4738]: W0307 07:00:06.492499 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:06Z is after 2026-02-23T05:33:13Z Mar 07 07:00:06 crc kubenswrapper[4738]: E0307 07:00:06.492631 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 07:00:07 crc kubenswrapper[4738]: I0307 07:00:07.101290 4738 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:00:07 crc kubenswrapper[4738]: I0307 07:00:07.101523 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:07 crc kubenswrapper[4738]: I0307 07:00:07.102914 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:07 crc kubenswrapper[4738]: I0307 07:00:07.103002 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:07 crc kubenswrapper[4738]: I0307 07:00:07.103029 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:07 crc kubenswrapper[4738]: I0307 07:00:07.103900 4738 scope.go:117] 
"RemoveContainer" containerID="9e273ba5660d1326a78a9a4cfb5dce2571c6edaa5bac3e33d0e555755e0fbb4d" Mar 07 07:00:07 crc kubenswrapper[4738]: E0307 07:00:07.104208 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 07:00:08 crc kubenswrapper[4738]: W0307 07:00:08.531647 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:08Z is after 2026-02-23T05:33:13Z Mar 07 07:00:08 crc kubenswrapper[4738]: E0307 07:00:08.531733 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:08Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 07:00:08 crc kubenswrapper[4738]: I0307 07:00:08.532100 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:08Z is after 2026-02-23T05:33:13Z Mar 07 07:00:09 crc kubenswrapper[4738]: I0307 07:00:09.324671 4738 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:09Z is after 2026-02-23T05:33:13Z Mar 07 07:00:09 crc kubenswrapper[4738]: I0307 07:00:09.669925 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:09 crc kubenswrapper[4738]: I0307 07:00:09.671642 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:09 crc kubenswrapper[4738]: I0307 07:00:09.671683 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:09 crc kubenswrapper[4738]: I0307 07:00:09.671692 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:09 crc kubenswrapper[4738]: I0307 07:00:09.671720 4738 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 07:00:09 crc kubenswrapper[4738]: E0307 07:00:09.672809 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:09Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 07 07:00:09 crc kubenswrapper[4738]: E0307 07:00:09.677285 4738 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:09Z is after 2026-02-23T05:33:13Z" node="crc" Mar 07 07:00:10 crc kubenswrapper[4738]: I0307 07:00:10.326115 4738 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:10Z is after 2026-02-23T05:33:13Z Mar 07 07:00:11 crc kubenswrapper[4738]: I0307 07:00:11.327114 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:11Z is after 2026-02-23T05:33:13Z Mar 07 07:00:12 crc kubenswrapper[4738]: I0307 07:00:12.325357 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:12Z is after 2026-02-23T05:33:13Z Mar 07 07:00:12 crc kubenswrapper[4738]: E0307 07:00:12.487079 4738 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 07:00:13 crc kubenswrapper[4738]: I0307 07:00:13.326214 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:13Z is after 2026-02-23T05:33:13Z Mar 07 07:00:14 crc kubenswrapper[4738]: I0307 07:00:14.327460 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-07T07:00:14Z is after 2026-02-23T05:33:13Z Mar 07 07:00:14 crc kubenswrapper[4738]: I0307 07:00:14.891346 4738 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:33976->192.168.126.11:10357: read: connection reset by peer" start-of-body= Mar 07 07:00:14 crc kubenswrapper[4738]: I0307 07:00:14.891444 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:33976->192.168.126.11:10357: read: connection reset by peer" Mar 07 07:00:14 crc kubenswrapper[4738]: I0307 07:00:14.891531 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 07:00:14 crc kubenswrapper[4738]: I0307 07:00:14.891749 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:14 crc kubenswrapper[4738]: I0307 07:00:14.893460 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:14 crc kubenswrapper[4738]: I0307 07:00:14.893539 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:14 crc kubenswrapper[4738]: I0307 07:00:14.893567 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:14 crc kubenswrapper[4738]: I0307 07:00:14.894602 4738 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" 
containerStatusID={"Type":"cri-o","ID":"279f12169f448af350bf9dc33c1af84d8f6f9ca8c029f85cd091109043448c92"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 07 07:00:14 crc kubenswrapper[4738]: I0307 07:00:14.894979 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://279f12169f448af350bf9dc33c1af84d8f6f9ca8c029f85cd091109043448c92" gracePeriod=30 Mar 07 07:00:15 crc kubenswrapper[4738]: I0307 07:00:15.326874 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:15Z is after 2026-02-23T05:33:13Z Mar 07 07:00:15 crc kubenswrapper[4738]: I0307 07:00:15.555946 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 07 07:00:15 crc kubenswrapper[4738]: I0307 07:00:15.557512 4738 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="279f12169f448af350bf9dc33c1af84d8f6f9ca8c029f85cd091109043448c92" exitCode=255 Mar 07 07:00:15 crc kubenswrapper[4738]: I0307 07:00:15.557650 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"279f12169f448af350bf9dc33c1af84d8f6f9ca8c029f85cd091109043448c92"} Mar 07 07:00:15 crc kubenswrapper[4738]: I0307 07:00:15.557741 4738 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99"} Mar 07 07:00:15 crc kubenswrapper[4738]: I0307 07:00:15.557912 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:15 crc kubenswrapper[4738]: I0307 07:00:15.559093 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:15 crc kubenswrapper[4738]: I0307 07:00:15.559174 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:15 crc kubenswrapper[4738]: I0307 07:00:15.559188 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:16 crc kubenswrapper[4738]: E0307 07:00:16.272766 4738 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:16Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189a7cf20a205e2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,LastTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:16 crc kubenswrapper[4738]: I0307 07:00:16.325413 4738 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:16Z is after 2026-02-23T05:33:13Z Mar 07 07:00:16 crc kubenswrapper[4738]: E0307 07:00:16.677100 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:16Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 07 07:00:16 crc kubenswrapper[4738]: I0307 07:00:16.678056 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:16 crc kubenswrapper[4738]: I0307 07:00:16.679491 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:16 crc kubenswrapper[4738]: I0307 07:00:16.679574 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:16 crc kubenswrapper[4738]: I0307 07:00:16.679600 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:16 crc kubenswrapper[4738]: I0307 07:00:16.679654 4738 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 07:00:16 crc kubenswrapper[4738]: E0307 07:00:16.682551 4738 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:16Z is after 2026-02-23T05:33:13Z" node="crc" Mar 07 07:00:17 crc kubenswrapper[4738]: I0307 07:00:17.324856 4738 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:17Z is after 2026-02-23T05:33:13Z Mar 07 07:00:17 crc kubenswrapper[4738]: I0307 07:00:17.526575 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 07:00:17 crc kubenswrapper[4738]: I0307 07:00:17.526799 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:17 crc kubenswrapper[4738]: I0307 07:00:17.528460 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:17 crc kubenswrapper[4738]: I0307 07:00:17.528524 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:17 crc kubenswrapper[4738]: I0307 07:00:17.528551 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:18 crc kubenswrapper[4738]: I0307 07:00:18.324780 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:18Z is after 2026-02-23T05:33:13Z Mar 07 07:00:19 crc kubenswrapper[4738]: I0307 07:00:19.324932 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:19Z is after 2026-02-23T05:33:13Z Mar 07 07:00:20 crc kubenswrapper[4738]: I0307 
07:00:20.325770 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:20Z is after 2026-02-23T05:33:13Z Mar 07 07:00:20 crc kubenswrapper[4738]: I0307 07:00:20.624720 4738 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 07:00:20 crc kubenswrapper[4738]: E0307 07:00:20.629394 4738 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 07:00:20 crc kubenswrapper[4738]: E0307 07:00:20.630620 4738 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 07 07:00:21 crc kubenswrapper[4738]: I0307 07:00:21.326928 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:21Z is after 2026-02-23T05:33:13Z Mar 07 07:00:21 crc kubenswrapper[4738]: I0307 07:00:21.385782 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:21 crc kubenswrapper[4738]: I0307 07:00:21.387474 4738 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:21 crc kubenswrapper[4738]: I0307 07:00:21.387533 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:21 crc kubenswrapper[4738]: I0307 07:00:21.387581 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:21 crc kubenswrapper[4738]: I0307 07:00:21.388635 4738 scope.go:117] "RemoveContainer" containerID="9e273ba5660d1326a78a9a4cfb5dce2571c6edaa5bac3e33d0e555755e0fbb4d" Mar 07 07:00:22 crc kubenswrapper[4738]: W0307 07:00:22.327621 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:22Z is after 2026-02-23T05:33:13Z Mar 07 07:00:22 crc kubenswrapper[4738]: E0307 07:00:22.327742 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:22Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 07:00:22 crc kubenswrapper[4738]: I0307 07:00:22.328050 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:22Z is after 2026-02-23T05:33:13Z Mar 07 07:00:22 crc kubenswrapper[4738]: E0307 07:00:22.487325 
4738 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 07:00:22 crc kubenswrapper[4738]: I0307 07:00:22.578728 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 07 07:00:22 crc kubenswrapper[4738]: I0307 07:00:22.579404 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 07 07:00:22 crc kubenswrapper[4738]: I0307 07:00:22.581382 4738 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4ee163bd2e3fc39c6a634e57c423513642f860e9e121f413bc8eceaeac7006d5" exitCode=255 Mar 07 07:00:22 crc kubenswrapper[4738]: I0307 07:00:22.581432 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4ee163bd2e3fc39c6a634e57c423513642f860e9e121f413bc8eceaeac7006d5"} Mar 07 07:00:22 crc kubenswrapper[4738]: I0307 07:00:22.581490 4738 scope.go:117] "RemoveContainer" containerID="9e273ba5660d1326a78a9a4cfb5dce2571c6edaa5bac3e33d0e555755e0fbb4d" Mar 07 07:00:22 crc kubenswrapper[4738]: I0307 07:00:22.581626 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:22 crc kubenswrapper[4738]: I0307 07:00:22.582477 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:22 crc kubenswrapper[4738]: I0307 07:00:22.582506 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:22 crc kubenswrapper[4738]: I0307 07:00:22.582519 4738 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:22 crc kubenswrapper[4738]: I0307 07:00:22.583061 4738 scope.go:117] "RemoveContainer" containerID="4ee163bd2e3fc39c6a634e57c423513642f860e9e121f413bc8eceaeac7006d5" Mar 07 07:00:22 crc kubenswrapper[4738]: E0307 07:00:22.583368 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 07:00:23 crc kubenswrapper[4738]: I0307 07:00:23.326691 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:23Z is after 2026-02-23T05:33:13Z Mar 07 07:00:23 crc kubenswrapper[4738]: I0307 07:00:23.404987 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 07:00:23 crc kubenswrapper[4738]: I0307 07:00:23.405318 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:23 crc kubenswrapper[4738]: I0307 07:00:23.406732 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:23 crc kubenswrapper[4738]: I0307 07:00:23.406768 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:23 crc kubenswrapper[4738]: I0307 07:00:23.406781 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:23 crc 
kubenswrapper[4738]: I0307 07:00:23.586705 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 07 07:00:23 crc kubenswrapper[4738]: I0307 07:00:23.682792 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:23 crc kubenswrapper[4738]: E0307 07:00:23.683361 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:23Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 07 07:00:23 crc kubenswrapper[4738]: I0307 07:00:23.684389 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:23 crc kubenswrapper[4738]: I0307 07:00:23.684627 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:23 crc kubenswrapper[4738]: I0307 07:00:23.684800 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:23 crc kubenswrapper[4738]: I0307 07:00:23.685006 4738 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 07 07:00:23 crc kubenswrapper[4738]: E0307 07:00:23.687986 4738 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:23Z is after 2026-02-23T05:33:13Z" node="crc" Mar 07 07:00:23 crc kubenswrapper[4738]: W0307 07:00:23.765404 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:23Z is after 2026-02-23T05:33:13Z Mar 07 07:00:23 crc kubenswrapper[4738]: E0307 07:00:23.765537 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 07:00:24 crc kubenswrapper[4738]: I0307 07:00:24.299582 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:00:24 crc kubenswrapper[4738]: I0307 07:00:24.300329 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:24 crc kubenswrapper[4738]: I0307 07:00:24.301869 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:24 crc kubenswrapper[4738]: I0307 07:00:24.301899 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:24 crc kubenswrapper[4738]: I0307 07:00:24.301913 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:24 crc kubenswrapper[4738]: I0307 07:00:24.302521 4738 scope.go:117] "RemoveContainer" containerID="4ee163bd2e3fc39c6a634e57c423513642f860e9e121f413bc8eceaeac7006d5" Mar 07 07:00:24 crc kubenswrapper[4738]: E0307 07:00:24.302698 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: 
\"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 07:00:24 crc kubenswrapper[4738]: I0307 07:00:24.325880 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:24Z is after 2026-02-23T05:33:13Z Mar 07 07:00:25 crc kubenswrapper[4738]: I0307 07:00:25.325062 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:25Z is after 2026-02-23T05:33:13Z Mar 07 07:00:26 crc kubenswrapper[4738]: E0307 07:00:26.277472 4738 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:26Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189a7cf20a205e2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,LastTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:26 crc kubenswrapper[4738]: I0307 07:00:26.324839 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:26Z is after 2026-02-23T05:33:13Z Mar 07 07:00:26 crc kubenswrapper[4738]: I0307 07:00:26.404959 4738 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 07:00:26 crc kubenswrapper[4738]: I0307 07:00:26.405073 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 07:00:27 crc kubenswrapper[4738]: I0307 07:00:27.100923 4738 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:00:27 crc kubenswrapper[4738]: I0307 07:00:27.101196 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:27 crc kubenswrapper[4738]: I0307 07:00:27.102522 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:27 crc kubenswrapper[4738]: I0307 07:00:27.102560 4738 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:27 crc kubenswrapper[4738]: I0307 07:00:27.102570 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:27 crc kubenswrapper[4738]: I0307 07:00:27.103191 4738 scope.go:117] "RemoveContainer" containerID="4ee163bd2e3fc39c6a634e57c423513642f860e9e121f413bc8eceaeac7006d5" Mar 07 07:00:27 crc kubenswrapper[4738]: E0307 07:00:27.103429 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 07:00:27 crc kubenswrapper[4738]: I0307 07:00:27.338971 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:27Z is after 2026-02-23T05:33:13Z Mar 07 07:00:27 crc kubenswrapper[4738]: W0307 07:00:27.768095 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:27Z is after 2026-02-23T05:33:13Z Mar 07 07:00:27 crc kubenswrapper[4738]: E0307 07:00:27.768215 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:27Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 07 07:00:28 crc kubenswrapper[4738]: I0307 07:00:28.325494 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:28Z is after 2026-02-23T05:33:13Z Mar 07 07:00:29 crc kubenswrapper[4738]: I0307 07:00:29.325066 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:29Z is after 2026-02-23T05:33:13Z Mar 07 07:00:29 crc kubenswrapper[4738]: W0307 07:00:29.706353 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:29Z is after 2026-02-23T05:33:13Z Mar 07 07:00:29 crc kubenswrapper[4738]: E0307 07:00:29.706448 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 
07 07:00:30 crc kubenswrapper[4738]: I0307 07:00:30.326055 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:30Z is after 2026-02-23T05:33:13Z
Mar 07 07:00:30 crc kubenswrapper[4738]: I0307 07:00:30.364425 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 07 07:00:30 crc kubenswrapper[4738]: I0307 07:00:30.364636 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:30 crc kubenswrapper[4738]: I0307 07:00:30.365913 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:30 crc kubenswrapper[4738]: I0307 07:00:30.365948 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:30 crc kubenswrapper[4738]: I0307 07:00:30.365957 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:30 crc kubenswrapper[4738]: I0307 07:00:30.689009 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:30 crc kubenswrapper[4738]: E0307 07:00:30.689202 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:30Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 07 07:00:30 crc kubenswrapper[4738]: I0307 07:00:30.690612 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:30 crc kubenswrapper[4738]: I0307 07:00:30.690677 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:30 crc kubenswrapper[4738]: I0307 07:00:30.690700 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:30 crc kubenswrapper[4738]: I0307 07:00:30.690751 4738 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 07 07:00:30 crc kubenswrapper[4738]: E0307 07:00:30.695714 4738 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:30Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 07 07:00:31 crc kubenswrapper[4738]: I0307 07:00:31.326738 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:31Z is after 2026-02-23T05:33:13Z
Mar 07 07:00:32 crc kubenswrapper[4738]: I0307 07:00:32.326637 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:32Z is after 2026-02-23T05:33:13Z
Mar 07 07:00:32 crc kubenswrapper[4738]: E0307 07:00:32.487463 4738 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 07 07:00:33 crc kubenswrapper[4738]: I0307 07:00:33.326533 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:33Z is after 2026-02-23T05:33:13Z
Mar 07 07:00:34 crc kubenswrapper[4738]: I0307 07:00:34.326380 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:34Z is after 2026-02-23T05:33:13Z
Mar 07 07:00:35 crc kubenswrapper[4738]: I0307 07:00:35.325529 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:35Z is after 2026-02-23T05:33:13Z
Mar 07 07:00:36 crc kubenswrapper[4738]: E0307 07:00:36.282242 4738 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:36Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189a7cf20a205e2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,LastTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:36 crc kubenswrapper[4738]: I0307 07:00:36.326808 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:36Z is after 2026-02-23T05:33:13Z
Mar 07 07:00:36 crc kubenswrapper[4738]: I0307 07:00:36.404928 4738 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 07 07:00:36 crc kubenswrapper[4738]: I0307 07:00:36.405042 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 07 07:00:37 crc kubenswrapper[4738]: I0307 07:00:37.325130 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:37Z is after 2026-02-23T05:33:13Z
Mar 07 07:00:37 crc kubenswrapper[4738]: E0307 07:00:37.695190 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:37Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 07 07:00:37 crc kubenswrapper[4738]: I0307 07:00:37.696392 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:37 crc kubenswrapper[4738]: I0307 07:00:37.698355 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:37 crc kubenswrapper[4738]: I0307 07:00:37.698421 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:37 crc kubenswrapper[4738]: I0307 07:00:37.698440 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:37 crc kubenswrapper[4738]: I0307 07:00:37.698484 4738 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 07 07:00:37 crc kubenswrapper[4738]: E0307 07:00:37.703988 4738 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:37Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 07 07:00:38 crc kubenswrapper[4738]: I0307 07:00:38.327660 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:38Z is after 2026-02-23T05:33:13Z
Mar 07 07:00:39 crc kubenswrapper[4738]: I0307 07:00:39.325204 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:00:39Z is after 2026-02-23T05:33:13Z
Mar 07 07:00:39 crc kubenswrapper[4738]: I0307 07:00:39.385381 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:39 crc kubenswrapper[4738]: I0307 07:00:39.386527 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:39 crc kubenswrapper[4738]: I0307 07:00:39.386581 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:39 crc kubenswrapper[4738]: I0307 07:00:39.386594 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:39 crc kubenswrapper[4738]: I0307 07:00:39.387383 4738 scope.go:117] "RemoveContainer" containerID="4ee163bd2e3fc39c6a634e57c423513642f860e9e121f413bc8eceaeac7006d5"
Mar 07 07:00:39 crc kubenswrapper[4738]: E0307 07:00:39.387602 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 07 07:00:40 crc kubenswrapper[4738]: I0307 07:00:40.328581 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:41 crc kubenswrapper[4738]: I0307 07:00:41.328037 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:42 crc kubenswrapper[4738]: I0307 07:00:42.328508 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:42 crc kubenswrapper[4738]: E0307 07:00:42.487635 4738 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 07 07:00:43 crc kubenswrapper[4738]: I0307 07:00:43.329282 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:44 crc kubenswrapper[4738]: I0307 07:00:44.324898 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:44 crc kubenswrapper[4738]: E0307 07:00:44.701085 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 07 07:00:44 crc kubenswrapper[4738]: I0307 07:00:44.704282 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:44 crc kubenswrapper[4738]: I0307 07:00:44.705731 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:44 crc kubenswrapper[4738]: I0307 07:00:44.705770 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:44 crc kubenswrapper[4738]: I0307 07:00:44.705781 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:44 crc kubenswrapper[4738]: I0307 07:00:44.705804 4738 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 07 07:00:44 crc kubenswrapper[4738]: E0307 07:00:44.709643 4738 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 07 07:00:45 crc kubenswrapper[4738]: I0307 07:00:45.330651 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:45 crc kubenswrapper[4738]: I0307 07:00:45.722369 4738 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:55024->192.168.126.11:10357: read: connection reset by peer" start-of-body=
Mar 07 07:00:45 crc kubenswrapper[4738]: I0307 07:00:45.722460 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:55024->192.168.126.11:10357: read: connection reset by peer"
Mar 07 07:00:45 crc kubenswrapper[4738]: I0307 07:00:45.722560 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 07 07:00:45 crc kubenswrapper[4738]: I0307 07:00:45.722774 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:45 crc kubenswrapper[4738]: I0307 07:00:45.724675 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:45 crc kubenswrapper[4738]: I0307 07:00:45.725079 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:45 crc kubenswrapper[4738]: I0307 07:00:45.725096 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:45 crc kubenswrapper[4738]: I0307 07:00:45.725664 4738 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 07 07:00:45 crc kubenswrapper[4738]: I0307 07:00:45.725751 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99" gracePeriod=30
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.289081 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20a205e2a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,LastTimestamp:2026-03-07 06:59:42.317534762 +0000 UTC m=+0.782522163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.294561 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d025c2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.36589982 +0000 UTC m=+0.830887141,LastTimestamp:2026-03-07 06:59:42.36589982 +0000 UTC m=+0.830887141,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.300036 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d02ae46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.365920838 +0000 UTC m=+0.830908159,LastTimestamp:2026-03-07 06:59:42.365920838 +0000 UTC m=+0.830908159,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.304813 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d02d109 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.365929737 +0000 UTC m=+0.830917058,LastTimestamp:2026-03-07 06:59:42.365929737 +0000 UTC m=+0.830917058,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.309566 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d025c2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d025c2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.36589982 +0000 UTC m=+0.830887141,LastTimestamp:2026-03-07 06:59:42.486942518 +0000 UTC m=+0.951929839,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.315425 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d02ae46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d02ae46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.365920838 +0000 UTC m=+0.830908159,LastTimestamp:2026-03-07 06:59:42.486964555 +0000 UTC m=+0.951951876,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.322101 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d02d109\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d02d109 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.365929737 +0000 UTC m=+0.830917058,LastTimestamp:2026-03-07 06:59:42.486972254 +0000 UTC m=+0.951959575,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.327546 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d025c2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d025c2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.36589982 +0000 UTC m=+0.830887141,LastTimestamp:2026-03-07 06:59:42.487976221 +0000 UTC m=+0.952963532,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: I0307 07:00:46.328295 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.331509 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d02ae46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d02ae46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.365920838 +0000 UTC m=+0.830908159,LastTimestamp:2026-03-07 06:59:42.488000388 +0000 UTC m=+0.952987709,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.335737 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d02d109\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d02d109 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.365929737 +0000 UTC m=+0.830917058,LastTimestamp:2026-03-07 06:59:42.488035093 +0000 UTC m=+0.953022414,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.340750 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d025c2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d025c2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.36589982 +0000 UTC m=+0.830887141,LastTimestamp:2026-03-07 06:59:42.489713362 +0000 UTC m=+0.954700683,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.346232 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d02ae46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d02ae46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.365920838 +0000 UTC m=+0.830908159,LastTimestamp:2026-03-07 06:59:42.489736369 +0000 UTC m=+0.954723700,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.351560 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d02d109\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d02d109 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.365929737 +0000 UTC m=+0.830917058,LastTimestamp:2026-03-07 06:59:42.489747458 +0000 UTC m=+0.954734789,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.356564 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d025c2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d025c2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.36589982 +0000 UTC m=+0.830887141,LastTimestamp:2026-03-07 06:59:42.490003234 +0000 UTC m=+0.954990555,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.360898 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d02ae46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d02ae46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.365920838 +0000 UTC m=+0.830908159,LastTimestamp:2026-03-07 06:59:42.490014333 +0000 UTC m=+0.955001654,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.365105 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d02d109\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d02d109 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.365929737 +0000 UTC m=+0.830917058,LastTimestamp:2026-03-07 06:59:42.490041969 +0000 UTC m=+0.955029290,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.370023 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d025c2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d025c2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.36589982 +0000 UTC m=+0.830887141,LastTimestamp:2026-03-07 06:59:42.490087663 +0000 UTC m=+0.955074984,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.372411 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d02ae46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d02ae46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.365920838 +0000 UTC m=+0.830908159,LastTimestamp:2026-03-07 06:59:42.490101791 +0000 UTC m=+0.955089112,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.377148 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d02d109\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d02d109 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.365929737 +0000 UTC m=+0.830917058,LastTimestamp:2026-03-07 06:59:42.490113149 +0000 UTC m=+0.955100470,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.381380 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf2146c4beb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.490282987 +0000 UTC m=+0.955270318,LastTimestamp:2026-03-07 06:59:42.490282987 +0000 UTC m=+0.955270318,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.384900 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d025c2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d025c2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.36589982 +0000 UTC m=+0.830887141,LastTimestamp:2026-03-07 06:59:42.491646927 +0000 UTC m=+0.956634248,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.388538 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d025c2c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d025c2c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.36589982 +0000 UTC m=+0.830887141,LastTimestamp:2026-03-07 06:59:42.491671223 +0000 UTC m=+0.956658544,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.392373 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d02ae46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d02ae46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.365920838 +0000 UTC m=+0.830908159,LastTimestamp:2026-03-07 06:59:42.4916936 +0000 UTC m=+0.956680921,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.396230 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d02d109\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d02d109 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.365929737 +0000 UTC m=+0.830917058,LastTimestamp:2026-03-07 06:59:42.491704649 +0000 UTC m=+0.956691970,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.400069 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189a7cf20d02ae46\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189a7cf20d02ae46 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.365920838 +0000 UTC m=+0.830908159,LastTimestamp:2026-03-07 06:59:42.491738774 +0000 UTC m=+0.956726095,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.407774 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a7cf22c0c022c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.886625836 +0000 UTC m=+1.351613197,LastTimestamp:2026-03-07
06:59:42.886625836 +0000 UTC m=+1.351613197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.411983 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf22c1d47f9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.887757817 +0000 UTC m=+1.352745178,LastTimestamp:2026-03-07 06:59:42.887757817 +0000 UTC m=+1.352745178,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.416252 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf22c1f0a58 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.887873112 +0000 UTC m=+1.352860473,LastTimestamp:2026-03-07 06:59:42.887873112 +0000 UTC m=+1.352860473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.430828 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7cf22d5fe077 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.908899447 +0000 UTC m=+1.373886768,LastTimestamp:2026-03-07 06:59:42.908899447 +0000 UTC m=+1.373886768,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.435946 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf22d635ac7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:42.909127367 +0000 UTC m=+1.374114688,LastTimestamp:2026-03-07 06:59:42.909127367 +0000 UTC m=+1.374114688,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.440968 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf2510ca394 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.507424148 +0000 UTC m=+1.972411469,LastTimestamp:2026-03-07 06:59:43.507424148 +0000 UTC m=+1.972411469,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 
07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.446236 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7cf2510e3ade openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.507528414 +0000 UTC m=+1.972515725,LastTimestamp:2026-03-07 06:59:43.507528414 +0000 UTC m=+1.972515725,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.451479 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf25111dd4a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.507766602 +0000 UTC m=+1.972753923,LastTimestamp:2026-03-07 06:59:43.507766602 +0000 UTC m=+1.972753923,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 
07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.455285 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf25115bb3d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.508020029 +0000 UTC m=+1.973007350,LastTimestamp:2026-03-07 06:59:43.508020029 +0000 UTC m=+1.973007350,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.461366 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a7cf2512476da openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.508985562 +0000 UTC m=+1.973972893,LastTimestamp:2026-03-07 06:59:43.508985562 +0000 UTC m=+1.973972893,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.465280 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf251caacdc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.519878364 +0000 UTC m=+1.984865685,LastTimestamp:2026-03-07 06:59:43.519878364 +0000 UTC m=+1.984865685,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.475091 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7cf251df5bd3 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.521233875 +0000 UTC 
m=+1.986221196,LastTimestamp:2026-03-07 06:59:43.521233875 +0000 UTC m=+1.986221196,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.479185 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf251eb4eea openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.522017002 +0000 UTC m=+1.987004323,LastTimestamp:2026-03-07 06:59:43.522017002 +0000 UTC m=+1.987004323,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.483847 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf251efa4f6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.522301174 +0000 UTC m=+1.987288495,LastTimestamp:2026-03-07 06:59:43.522301174 +0000 UTC m=+1.987288495,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.488043 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a7cf251f2a146 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.522496838 +0000 UTC m=+1.987484159,LastTimestamp:2026-03-07 06:59:43.522496838 +0000 UTC m=+1.987484159,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.492049 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf252002dc4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.523384772 +0000 UTC m=+1.988372113,LastTimestamp:2026-03-07 06:59:43.523384772 +0000 UTC m=+1.988372113,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.496946 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf26538993d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.845849405 +0000 UTC m=+2.310836726,LastTimestamp:2026-03-07 06:59:43.845849405 +0000 UTC m=+2.310836726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.501272 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf26612304e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.86010939 +0000 UTC m=+2.325096731,LastTimestamp:2026-03-07 06:59:43.86010939 +0000 UTC m=+2.325096731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.505671 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf2662931d2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.861617106 +0000 UTC m=+2.326604447,LastTimestamp:2026-03-07 06:59:43.861617106 +0000 UTC m=+2.326604447,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.511117 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf2738124ba openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.08548473 +0000 UTC m=+2.550472061,LastTimestamp:2026-03-07 06:59:44.08548473 +0000 UTC m=+2.550472061,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.515641 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf27422ad68 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container 
kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.096071016 +0000 UTC m=+2.561058337,LastTimestamp:2026-03-07 06:59:44.096071016 +0000 UTC m=+2.561058337,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.521003 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf27447245a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.098460762 +0000 UTC m=+2.563448083,LastTimestamp:2026-03-07 06:59:44.098460762 +0000 UTC m=+2.563448083,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.526958 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf282b87c20 openshift-kube-controller-manager 
0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.340769824 +0000 UTC m=+2.805757145,LastTimestamp:2026-03-07 06:59:44.340769824 +0000 UTC m=+2.805757145,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.531542 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf283747ac7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.353090247 +0000 UTC m=+2.818077568,LastTimestamp:2026-03-07 06:59:44.353090247 +0000 UTC m=+2.818077568,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.535828 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a7cf286d44f6f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.409702255 +0000 UTC m=+2.874689586,LastTimestamp:2026-03-07 06:59:44.409702255 +0000 UTC m=+2.874689586,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.543874 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7cf2871c3031 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.414412849 +0000 UTC 
m=+2.879400200,LastTimestamp:2026-03-07 06:59:44.414412849 +0000 UTC m=+2.879400200,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.549323 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2879fd314 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.423039764 +0000 UTC m=+2.888027125,LastTimestamp:2026-03-07 06:59:44.423039764 +0000 UTC m=+2.888027125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.556283 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf288c4740e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.442217486 +0000 UTC m=+2.907204847,LastTimestamp:2026-03-07 06:59:44.442217486 +0000 UTC m=+2.907204847,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.561639 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7cf294c92358 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.643851096 +0000 UTC m=+3.108838407,LastTimestamp:2026-03-07 06:59:44.643851096 +0000 UTC m=+3.108838407,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.566596 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a7cf294d823df openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.644834271 +0000 UTC m=+3.109821592,LastTimestamp:2026-03-07 06:59:44.644834271 +0000 UTC m=+3.109821592,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.572849 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2950095f5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.647484917 +0000 UTC m=+3.112472238,LastTimestamp:2026-03-07 06:59:44.647484917 +0000 UTC m=+3.112472238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.579144 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189a7cf2964388c5 
openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.668649669 +0000 UTC m=+3.133636990,LastTimestamp:2026-03-07 06:59:44.668649669 +0000 UTC m=+3.133636990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.583198 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7cf29662da4c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.670702156 +0000 UTC m=+3.135689477,LastTimestamp:2026-03-07 06:59:44.670702156 +0000 UTC m=+3.135689477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.587954 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7cf296746532 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.671851826 +0000 UTC m=+3.136839147,LastTimestamp:2026-03-07 06:59:44.671851826 +0000 UTC m=+3.136839147,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.594491 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf296c676b9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.677230265 +0000 UTC m=+3.142217596,LastTimestamp:2026-03-07 06:59:44.677230265 +0000 UTC m=+3.142217596,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 
07:00:46.599817 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf29763b636 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.68753567 +0000 UTC m=+3.152523001,LastTimestamp:2026-03-07 06:59:44.68753567 +0000 UTC m=+3.152523001,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.605189 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf29764878a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.687589258 +0000 UTC m=+3.152576589,LastTimestamp:2026-03-07 06:59:44.687589258 +0000 UTC m=+3.152576589,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.609713 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf2994e451e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.719684894 +0000 UTC m=+3.184672215,LastTimestamp:2026-03-07 06:59:44.719684894 +0000 UTC m=+3.184672215,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.614140 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2a3568c86 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.887999622 +0000 UTC m=+3.352986943,LastTimestamp:2026-03-07 06:59:44.887999622 +0000 UTC 
m=+3.352986943,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.618293 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7cf2a37ee81d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.890644509 +0000 UTC m=+3.355631830,LastTimestamp:2026-03-07 06:59:44.890644509 +0000 UTC m=+3.355631830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.622915 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2a42c51bd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 
06:59:44.902009277 +0000 UTC m=+3.366996608,LastTimestamp:2026-03-07 06:59:44.902009277 +0000 UTC m=+3.366996608,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.628300 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2a43d69c1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.903129537 +0000 UTC m=+3.368116858,LastTimestamp:2026-03-07 06:59:44.903129537 +0000 UTC m=+3.368116858,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.633328 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7cf2a4675841 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.905877569 +0000 UTC m=+3.370864900,LastTimestamp:2026-03-07 06:59:44.905877569 +0000 UTC m=+3.370864900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.637593 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7cf2a4743daa openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:44.90672273 +0000 UTC m=+3.371710051,LastTimestamp:2026-03-07 06:59:44.90672273 +0000 UTC m=+3.371710051,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.642048 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2b005e1c4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.100816836 +0000 UTC m=+3.565804177,LastTimestamp:2026-03-07 06:59:45.100816836 +0000 UTC m=+3.565804177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.647374 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7cf2b04e3391 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.105556369 +0000 UTC m=+3.570543680,LastTimestamp:2026-03-07 06:59:45.105556369 +0000 UTC m=+3.570543680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc 
kubenswrapper[4738]: E0307 07:00:46.652218 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2b11ea491 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.119216785 +0000 UTC m=+3.584204106,LastTimestamp:2026-03-07 06:59:45.119216785 +0000 UTC m=+3.584204106,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.655609 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2b1363be3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.120762851 +0000 UTC 
m=+3.585750172,LastTimestamp:2026-03-07 06:59:45.120762851 +0000 UTC m=+3.585750172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.658942 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189a7cf2b14a48ca openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.122076874 +0000 UTC m=+3.587064205,LastTimestamp:2026-03-07 06:59:45.122076874 +0000 UTC m=+3.587064205,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.662568 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2bbafdee5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container 
kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.296506597 +0000 UTC m=+3.761493918,LastTimestamp:2026-03-07 06:59:45.296506597 +0000 UTC m=+3.761493918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: I0307 07:00:46.663745 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 07 07:00:46 crc kubenswrapper[4738]: I0307 07:00:46.664715 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 07 07:00:46 crc kubenswrapper[4738]: I0307 07:00:46.665029 4738 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99" exitCode=255 Mar 07 07:00:46 crc kubenswrapper[4738]: I0307 07:00:46.665078 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99"} Mar 07 07:00:46 crc kubenswrapper[4738]: I0307 07:00:46.665114 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ebc4b3c9f8fe720d56b45f2fff2c285864e7854909f89cf011551a7d80815722"} Mar 07 07:00:46 crc kubenswrapper[4738]: I0307 07:00:46.665137 4738 scope.go:117] "RemoveContainer" containerID="279f12169f448af350bf9dc33c1af84d8f6f9ca8c029f85cd091109043448c92" Mar 07 07:00:46 crc 
kubenswrapper[4738]: I0307 07:00:46.665378 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.667392 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2bc5993eb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.307628523 +0000 UTC m=+3.772615844,LastTimestamp:2026-03-07 06:59:45.307628523 +0000 UTC m=+3.772615844,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: I0307 07:00:46.667563 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:46 crc kubenswrapper[4738]: I0307 07:00:46.667597 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:46 crc kubenswrapper[4738]: I0307 07:00:46.667610 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.671360 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2bc6c6cbf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.308863679 +0000 UTC m=+3.773851000,LastTimestamp:2026-03-07 06:59:45.308863679 +0000 UTC m=+3.773851000,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.675388 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf2c4457237 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.440526903 +0000 UTC m=+3.905514224,LastTimestamp:2026-03-07 06:59:45.440526903 +0000 UTC m=+3.905514224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.679697 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2cad6bc63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.550711907 +0000 UTC m=+4.015699228,LastTimestamp:2026-03-07 06:59:45.550711907 +0000 UTC m=+4.015699228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.686388 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2cb9c076e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.56364171 +0000 UTC m=+4.028629031,LastTimestamp:2026-03-07 
06:59:45.56364171 +0000 UTC m=+4.028629031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.687856 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf2d1cba07c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.66742438 +0000 UTC m=+4.132411701,LastTimestamp:2026-03-07 06:59:45.66742438 +0000 UTC m=+4.132411701,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.691619 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf2d26c9cb3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.677974707 +0000 UTC m=+4.142962028,LastTimestamp:2026-03-07 06:59:45.677974707 +0000 UTC 
m=+4.142962028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.695913 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf301aae257 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:46.470584919 +0000 UTC m=+4.935572240,LastTimestamp:2026-03-07 06:59:46.470584919 +0000 UTC m=+4.935572240,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.699328 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf30cd6826a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:46.657993322 +0000 UTC 
m=+5.122980643,LastTimestamp:2026-03-07 06:59:46.657993322 +0000 UTC m=+5.122980643,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.702876 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf30d5773a3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:46.666443683 +0000 UTC m=+5.131431004,LastTimestamp:2026-03-07 06:59:46.666443683 +0000 UTC m=+5.131431004,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.706211 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf30d6a4a3d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:46.667678269 +0000 UTC m=+5.132665590,LastTimestamp:2026-03-07 06:59:46.667678269 +0000 UTC m=+5.132665590,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.709946 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf31a2aeb06 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:46.881628934 +0000 UTC m=+5.346616295,LastTimestamp:2026-03-07 06:59:46.881628934 +0000 UTC m=+5.346616295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.715223 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf31adf01ba openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:46.893431226 +0000 UTC 
m=+5.358418557,LastTimestamp:2026-03-07 06:59:46.893431226 +0000 UTC m=+5.358418557,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.719174 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf31af24bad openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:46.894695341 +0000 UTC m=+5.359682662,LastTimestamp:2026-03-07 06:59:46.894695341 +0000 UTC m=+5.359682662,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.723577 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf32733499a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:47.100281242 +0000 UTC m=+5.565268573,LastTimestamp:2026-03-07 06:59:47.100281242 +0000 UTC m=+5.565268573,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.727704 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf32853bd47 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:47.119185223 +0000 UTC m=+5.584172544,LastTimestamp:2026-03-07 06:59:47.119185223 +0000 UTC m=+5.584172544,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.731432 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf3287298b9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:47.121207481 +0000 UTC m=+5.586194802,LastTimestamp:2026-03-07 06:59:47.121207481 +0000 UTC m=+5.586194802,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.735281 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf335603622 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:47.338106402 +0000 UTC m=+5.803093723,LastTimestamp:2026-03-07 06:59:47.338106402 +0000 UTC m=+5.803093723,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.739140 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf336364e46 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:47.352137286 +0000 UTC m=+5.817124607,LastTimestamp:2026-03-07 06:59:47.352137286 +0000 UTC m=+5.817124607,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.745276 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf3364ea6ab openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:47.353732779 +0000 UTC m=+5.818720130,LastTimestamp:2026-03-07 06:59:47.353732779 +0000 UTC m=+5.818720130,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.749996 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf3432d7ced openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:47.569663213 +0000 UTC m=+6.034650534,LastTimestamp:2026-03-07 06:59:47.569663213 +0000 UTC m=+6.034650534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.755433 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189a7cf3440b4529 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:47.584197929 +0000 UTC m=+6.049185250,LastTimestamp:2026-03-07 06:59:47.584197929 +0000 UTC m=+6.049185250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.763026 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 07 07:00:46 crc kubenswrapper[4738]: &Event{ObjectMeta:{kube-apiserver-crc.189a7cf5496457d1 openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 07 07:00:46 crc kubenswrapper[4738]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 07 07:00:46 crc kubenswrapper[4738]: Mar 07 07:00:46 crc kubenswrapper[4738]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:56.263856081 +0000 UTC m=+14.728843422,LastTimestamp:2026-03-07 06:59:56.263856081 +0000 UTC m=+14.728843422,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 07:00:46 crc kubenswrapper[4738]: > Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.770454 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf5496512ef openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:56.263903983 +0000 UTC m=+14.728891314,LastTimestamp:2026-03-07 06:59:56.263903983 +0000 UTC m=+14.728891314,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.803786 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189a7cf5496457d1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 07 07:00:46 crc kubenswrapper[4738]: &Event{ObjectMeta:{kube-apiserver-crc.189a7cf5496457d1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 07 07:00:46 crc kubenswrapper[4738]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 07 07:00:46 crc kubenswrapper[4738]: Mar 07 07:00:46 crc kubenswrapper[4738]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:56.263856081 +0000 UTC m=+14.728843422,LastTimestamp:2026-03-07 06:59:56.268712058 +0000 UTC m=+14.733699379,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 07:00:46 crc kubenswrapper[4738]: > Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.806203 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189a7cf5496512ef\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf5496512ef openshift-kube-apiserver 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:56.263903983 +0000 UTC m=+14.728891314,LastTimestamp:2026-03-07 06:59:56.268749229 +0000 UTC m=+14.733736550,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.809417 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 07 07:00:46 crc kubenswrapper[4738]: &Event{ObjectMeta:{kube-controller-manager-crc.189a7cf551daeb30 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 07 07:00:46 crc kubenswrapper[4738]: body: Mar 07 07:00:46 crc kubenswrapper[4738]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:56.405844784 +0000 UTC m=+14.870832125,LastTimestamp:2026-03-07 06:59:56.405844784 +0000 UTC m=+14.870832125,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 07:00:46 crc kubenswrapper[4738]: > Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.814067 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf551dc353c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:56.405929276 +0000 UTC m=+14.870916607,LastTimestamp:2026-03-07 06:59:56.405929276 +0000 UTC m=+14.870916607,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.816496 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189a7cf2bc6c6cbf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2bc6c6cbf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.308863679 +0000 UTC m=+3.773851000,LastTimestamp:2026-03-07 06:59:56.519150569 +0000 UTC m=+14.984137900,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.821009 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189a7cf2cad6bc63\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2cad6bc63 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.550711907 +0000 UTC m=+4.015699228,LastTimestamp:2026-03-07 06:59:56.740201625 +0000 UTC m=+15.205188946,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.826057 4738 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189a7cf2cb9c076e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189a7cf2cb9c076e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:45.56364171 +0000 UTC m=+4.028629031,LastTimestamp:2026-03-07 06:59:56.75068468 +0000 UTC m=+15.215672001,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.831660 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a7cf551daeb30\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 07 07:00:46 crc kubenswrapper[4738]: &Event{ObjectMeta:{kube-controller-manager-crc.189a7cf551daeb30 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 07 07:00:46 crc kubenswrapper[4738]: body: Mar 07 07:00:46 crc 
kubenswrapper[4738]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:56.405844784 +0000 UTC m=+14.870832125,LastTimestamp:2026-03-07 07:00:06.40481379 +0000 UTC m=+24.869801151,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 07 07:00:46 crc kubenswrapper[4738]: > Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.836419 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a7cf551dc353c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf551dc353c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:56.405929276 +0000 UTC m=+14.870916607,LastTimestamp:2026-03-07 07:00:06.404909063 +0000 UTC m=+24.869896424,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.842001 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 07 07:00:46 crc kubenswrapper[4738]: 
&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf99fae5bb4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:33976->192.168.126.11:10357: read: connection reset by peer
Mar 07 07:00:46 crc kubenswrapper[4738]: body:
Mar 07 07:00:46 crc kubenswrapper[4738]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 07:00:14.8914165 +0000 UTC m=+33.356403861,LastTimestamp:2026-03-07 07:00:14.8914165 +0000 UTC m=+33.356403861,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 07 07:00:46 crc kubenswrapper[4738]: >
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.853870 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf99faf706e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:33976->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 07:00:14.891487342 +0000 UTC m=+33.356474693,LastTimestamp:2026-03-07 07:00:14.891487342 +0000 UTC m=+33.356474693,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.859456 4738 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf99fe3dcd9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 07:00:14.894922969 +0000 UTC m=+33.359910380,LastTimestamp:2026-03-07 07:00:14.894922969 +0000 UTC m=+33.359910380,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.864038 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a7cf251efa4f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf251efa4f6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.522301174 +0000 UTC m=+1.987288495,LastTimestamp:2026-03-07 07:00:14.916953478 +0000 UTC m=+33.381940809,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.867889 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a7cf26538993d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf26538993d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.845849405 +0000 UTC m=+2.310836726,LastTimestamp:2026-03-07 07:00:15.138647192 +0000 UTC m=+33.603634503,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.871327 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a7cf26612304e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf26612304e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:43.86010939 +0000 UTC m=+2.325096731,LastTimestamp:2026-03-07 07:00:15.148018685 +0000 UTC m=+33.613006006,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.876023 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a7cf551daeb30\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 07 07:00:46 crc kubenswrapper[4738]: &Event{ObjectMeta:{kube-controller-manager-crc.189a7cf551daeb30 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 07 07:00:46 crc kubenswrapper[4738]: body:
Mar 07 07:00:46 crc kubenswrapper[4738]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:56.405844784 +0000 UTC m=+14.870832125,LastTimestamp:2026-03-07 07:00:26.405048901 +0000 UTC m=+44.870036222,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 07 07:00:46 crc kubenswrapper[4738]: >
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.880956 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a7cf551dc353c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189a7cf551dc353c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:56.405929276 +0000 UTC m=+14.870916607,LastTimestamp:2026-03-07 07:00:26.405105912 +0000 UTC m=+44.870093233,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 07 07:00:46 crc kubenswrapper[4738]: E0307 07:00:46.892694 4738 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189a7cf551daeb30\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 07 07:00:46 crc kubenswrapper[4738]: &Event{ObjectMeta:{kube-controller-manager-crc.189a7cf551daeb30 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 07 07:00:46 crc kubenswrapper[4738]: body:
Mar 07 07:00:46 crc kubenswrapper[4738]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 06:59:56.405844784 +0000 UTC m=+14.870832125,LastTimestamp:2026-03-07 07:00:36.405016164 +0000 UTC m=+54.870003525,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 07 07:00:46 crc kubenswrapper[4738]: >
Mar 07 07:00:47 crc kubenswrapper[4738]: I0307 07:00:47.325878 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:47 crc kubenswrapper[4738]: I0307 07:00:47.526234 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 07 07:00:47 crc kubenswrapper[4738]: I0307 07:00:47.668581 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 07 07:00:47 crc kubenswrapper[4738]: I0307 07:00:47.669494 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:47 crc kubenswrapper[4738]: I0307 07:00:47.670289 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:47 crc kubenswrapper[4738]: I0307 07:00:47.670318 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:47 crc kubenswrapper[4738]: I0307 07:00:47.670327 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:48 crc kubenswrapper[4738]: I0307 07:00:48.326897 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:49 crc kubenswrapper[4738]: I0307 07:00:49.327610 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:50 crc kubenswrapper[4738]: I0307 07:00:50.326388 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.326603 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.385730 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.386988 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.387024 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.387035 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.387765 4738 scope.go:117] "RemoveContainer" containerID="4ee163bd2e3fc39c6a634e57c423513642f860e9e121f413bc8eceaeac7006d5"
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.684997 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.686886 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233"}
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.687125 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.688044 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.688083 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.688104 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.711024 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.712482 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.712528 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.712546 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:51 crc kubenswrapper[4738]: I0307 07:00:51.712578 4738 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 07 07:00:51 crc kubenswrapper[4738]: E0307 07:00:51.717116 4738 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 07 07:00:51 crc kubenswrapper[4738]: E0307 07:00:51.717291 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 07 07:00:52 crc kubenswrapper[4738]: I0307 07:00:52.326519 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:52 crc kubenswrapper[4738]: E0307 07:00:52.488042 4738 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 07 07:00:52 crc kubenswrapper[4738]: I0307 07:00:52.632047 4738 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 07 07:00:52 crc kubenswrapper[4738]: I0307 07:00:52.659858 4738 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.325391 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.405007 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.405299 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.406557 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.406595 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.406603 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.409015 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.698563 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.699546 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.702690 4738 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233" exitCode=255
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.702757 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233"}
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.702885 4738 scope.go:117] "RemoveContainer" containerID="4ee163bd2e3fc39c6a634e57c423513642f860e9e121f413bc8eceaeac7006d5"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.703316 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.703595 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.705195 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.705594 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.706490 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.706646 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.706573 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.707425 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:53 crc kubenswrapper[4738]: I0307 07:00:53.707839 4738 scope.go:117] "RemoveContainer" containerID="1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233"
Mar 07 07:00:53 crc kubenswrapper[4738]: E0307 07:00:53.708294 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 07 07:00:54 crc kubenswrapper[4738]: I0307 07:00:54.300412 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 07 07:00:54 crc kubenswrapper[4738]: I0307 07:00:54.328406 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:54 crc kubenswrapper[4738]: I0307 07:00:54.706649 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 07 07:00:54 crc kubenswrapper[4738]: I0307 07:00:54.708916 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:54 crc kubenswrapper[4738]: I0307 07:00:54.709867 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:54 crc kubenswrapper[4738]: I0307 07:00:54.709941 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:54 crc kubenswrapper[4738]: I0307 07:00:54.709957 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:54 crc kubenswrapper[4738]: I0307 07:00:54.710641 4738 scope.go:117] "RemoveContainer" containerID="1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233"
Mar 07 07:00:54 crc kubenswrapper[4738]: E0307 07:00:54.710842 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 07 07:00:54 crc kubenswrapper[4738]: W0307 07:00:54.721910 4738 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 07 07:00:54 crc kubenswrapper[4738]: E0307 07:00:54.721957 4738 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 07 07:00:55 crc kubenswrapper[4738]: I0307 07:00:55.326241 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:56 crc kubenswrapper[4738]: I0307 07:00:56.327094 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.100814 4738 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.101098 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.102872 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.102944 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.102960 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.103667 4738 scope.go:117] "RemoveContainer" containerID="1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233"
Mar 07 07:00:57 crc kubenswrapper[4738]: E0307 07:00:57.103854 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.326009 4738 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.385613 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.387028 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.387328 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.387411 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.436465 4738 csr.go:261] certificate signing request csr-vqddf is approved, waiting to be issued
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.444931 4738 csr.go:257] certificate signing request csr-vqddf is issued
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.531877 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.532040 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.533404 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.533450 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.533466 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:57 crc kubenswrapper[4738]: I0307 07:00:57.540923 4738 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.166381 4738 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.446664 4738 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-11 13:56:37.590694818 +0000 UTC
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.446766 4738 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6702h55m39.143941019s for next certificate rotation
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.717693 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.718982 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.719069 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.719093 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.719342 4738 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.729494 4738 kubelet_node_status.go:115] "Node was previously registered" node="crc"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.729940 4738 kubelet_node_status.go:79] "Successfully registered node" node="crc"
Mar 07 07:00:58 crc kubenswrapper[4738]: E0307 07:00:58.729981 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.735628 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.735679 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.735697 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.735726 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.735752 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:00:58Z","lastTransitionTime":"2026-03-07T07:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 07 07:00:58 crc kubenswrapper[4738]: E0307 07:00:58.759677 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.771140 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.771236 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.771255 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.771281 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.771301 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:00:58Z","lastTransitionTime":"2026-03-07T07:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:00:58 crc kubenswrapper[4738]: E0307 07:00:58.789297 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[…identical image list elided; same payload as the preceding patch attempt…],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.800972 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.801031 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.801052 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.801078 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.801097 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:00:58Z","lastTransitionTime":"2026-03-07T07:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:00:58 crc kubenswrapper[4738]: E0307 07:00:58.819461 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[…identical image list elided; same payload as the preceding patch attempt…],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.831396 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.831502 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.831532 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.831566 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:00:58 crc kubenswrapper[4738]: I0307 07:00:58.831592 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:00:58Z","lastTransitionTime":"2026-03-07T07:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:00:58 crc kubenswrapper[4738]: E0307 07:00:58.872956 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:00:58 crc kubenswrapper[4738]: E0307 07:00:58.873244 4738 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 07:00:58 crc kubenswrapper[4738]: E0307 07:00:58.873284 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:00:58 crc kubenswrapper[4738]: E0307 07:00:58.974080 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:00:59 crc kubenswrapper[4738]: E0307 07:00:59.075100 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:00:59 crc kubenswrapper[4738]: E0307 07:00:59.175887 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:00:59 crc kubenswrapper[4738]: E0307 07:00:59.276451 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:00:59 crc kubenswrapper[4738]: E0307 07:00:59.376802 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:00:59 crc kubenswrapper[4738]: E0307 07:00:59.477659 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:00:59 crc kubenswrapper[4738]: E0307 07:00:59.578148 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:00:59 crc kubenswrapper[4738]: E0307 07:00:59.678946 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:00:59 crc kubenswrapper[4738]: E0307 07:00:59.779366 4738 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:00:59 crc kubenswrapper[4738]: E0307 07:00:59.879818 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:00:59 crc kubenswrapper[4738]: E0307 07:00:59.980945 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:00 crc kubenswrapper[4738]: E0307 07:01:00.081887 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:00 crc kubenswrapper[4738]: E0307 07:01:00.182790 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:00 crc kubenswrapper[4738]: E0307 07:01:00.282942 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:00 crc kubenswrapper[4738]: E0307 07:01:00.383691 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:00 crc kubenswrapper[4738]: E0307 07:01:00.484721 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:00 crc kubenswrapper[4738]: E0307 07:01:00.585934 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:00 crc kubenswrapper[4738]: E0307 07:01:00.686672 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:00 crc kubenswrapper[4738]: E0307 07:01:00.787071 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:00 crc kubenswrapper[4738]: E0307 07:01:00.888119 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:00 crc 
kubenswrapper[4738]: E0307 07:01:00.988888 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:01 crc kubenswrapper[4738]: E0307 07:01:01.089867 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:01 crc kubenswrapper[4738]: E0307 07:01:01.190380 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:01 crc kubenswrapper[4738]: I0307 07:01:01.210346 4738 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 07 07:01:01 crc kubenswrapper[4738]: E0307 07:01:01.291470 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:01 crc kubenswrapper[4738]: E0307 07:01:01.392091 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:01 crc kubenswrapper[4738]: E0307 07:01:01.492749 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:01 crc kubenswrapper[4738]: E0307 07:01:01.593753 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:01 crc kubenswrapper[4738]: E0307 07:01:01.693923 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:01 crc kubenswrapper[4738]: E0307 07:01:01.794982 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:01 crc kubenswrapper[4738]: E0307 07:01:01.896287 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:01 crc kubenswrapper[4738]: E0307 07:01:01.997252 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"crc\" not found" Mar 07 07:01:02 crc kubenswrapper[4738]: E0307 07:01:02.097771 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:02 crc kubenswrapper[4738]: E0307 07:01:02.198950 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:02 crc kubenswrapper[4738]: E0307 07:01:02.299914 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:02 crc kubenswrapper[4738]: E0307 07:01:02.400726 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:02 crc kubenswrapper[4738]: E0307 07:01:02.488556 4738 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 07:01:02 crc kubenswrapper[4738]: E0307 07:01:02.501883 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:02 crc kubenswrapper[4738]: E0307 07:01:02.602099 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:02 crc kubenswrapper[4738]: E0307 07:01:02.703198 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:02 crc kubenswrapper[4738]: E0307 07:01:02.803405 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:02 crc kubenswrapper[4738]: E0307 07:01:02.904242 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:03 crc kubenswrapper[4738]: E0307 07:01:03.004937 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:03 crc kubenswrapper[4738]: E0307 07:01:03.105884 4738 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:03 crc kubenswrapper[4738]: I0307 07:01:03.126102 4738 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 07 07:01:03 crc kubenswrapper[4738]: E0307 07:01:03.206228 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:03 crc kubenswrapper[4738]: E0307 07:01:03.307054 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:03 crc kubenswrapper[4738]: E0307 07:01:03.407375 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:03 crc kubenswrapper[4738]: E0307 07:01:03.508146 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:03 crc kubenswrapper[4738]: E0307 07:01:03.608642 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:03 crc kubenswrapper[4738]: E0307 07:01:03.709813 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:03 crc kubenswrapper[4738]: E0307 07:01:03.810791 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:03 crc kubenswrapper[4738]: E0307 07:01:03.911479 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:04 crc kubenswrapper[4738]: E0307 07:01:04.012394 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:04 crc kubenswrapper[4738]: E0307 07:01:04.113409 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:04 crc 
kubenswrapper[4738]: E0307 07:01:04.213880 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:04 crc kubenswrapper[4738]: E0307 07:01:04.314958 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:04 crc kubenswrapper[4738]: E0307 07:01:04.415920 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:04 crc kubenswrapper[4738]: E0307 07:01:04.516824 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:04 crc kubenswrapper[4738]: E0307 07:01:04.617779 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:04 crc kubenswrapper[4738]: E0307 07:01:04.718857 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:04 crc kubenswrapper[4738]: E0307 07:01:04.819498 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:04 crc kubenswrapper[4738]: E0307 07:01:04.920141 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:05 crc kubenswrapper[4738]: E0307 07:01:05.021367 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:05 crc kubenswrapper[4738]: E0307 07:01:05.122092 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:05 crc kubenswrapper[4738]: E0307 07:01:05.222577 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:05 crc kubenswrapper[4738]: E0307 07:01:05.323526 4738 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 07 07:01:05 crc kubenswrapper[4738]: E0307 07:01:05.424683 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:05 crc kubenswrapper[4738]: E0307 07:01:05.525698 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:05 crc kubenswrapper[4738]: E0307 07:01:05.625867 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:05 crc kubenswrapper[4738]: E0307 07:01:05.726616 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:05 crc kubenswrapper[4738]: E0307 07:01:05.826809 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:05 crc kubenswrapper[4738]: E0307 07:01:05.927629 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:06 crc kubenswrapper[4738]: E0307 07:01:06.028248 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:06 crc kubenswrapper[4738]: E0307 07:01:06.128985 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:06 crc kubenswrapper[4738]: E0307 07:01:06.230143 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:06 crc kubenswrapper[4738]: E0307 07:01:06.330948 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:06 crc kubenswrapper[4738]: E0307 07:01:06.431780 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:06 crc kubenswrapper[4738]: E0307 07:01:06.532466 4738 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:06 crc kubenswrapper[4738]: E0307 07:01:06.632815 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:06 crc kubenswrapper[4738]: E0307 07:01:06.733817 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:06 crc kubenswrapper[4738]: E0307 07:01:06.834917 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:06 crc kubenswrapper[4738]: E0307 07:01:06.935815 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:07 crc kubenswrapper[4738]: E0307 07:01:07.036677 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:07 crc kubenswrapper[4738]: E0307 07:01:07.137323 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:07 crc kubenswrapper[4738]: E0307 07:01:07.238028 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:07 crc kubenswrapper[4738]: E0307 07:01:07.339229 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:07 crc kubenswrapper[4738]: E0307 07:01:07.440353 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:07 crc kubenswrapper[4738]: E0307 07:01:07.541224 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:07 crc kubenswrapper[4738]: E0307 07:01:07.642363 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:07 crc kubenswrapper[4738]: E0307 
07:01:07.743462 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:07 crc kubenswrapper[4738]: E0307 07:01:07.843678 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:07 crc kubenswrapper[4738]: E0307 07:01:07.944660 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:08 crc kubenswrapper[4738]: E0307 07:01:08.045439 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:08 crc kubenswrapper[4738]: E0307 07:01:08.145937 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:08 crc kubenswrapper[4738]: E0307 07:01:08.246262 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:08 crc kubenswrapper[4738]: E0307 07:01:08.347219 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:08 crc kubenswrapper[4738]: E0307 07:01:08.448340 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:08 crc kubenswrapper[4738]: E0307 07:01:08.549427 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:08 crc kubenswrapper[4738]: E0307 07:01:08.650579 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:08 crc kubenswrapper[4738]: E0307 07:01:08.751280 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:08 crc kubenswrapper[4738]: E0307 07:01:08.852002 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 
07:01:08 crc kubenswrapper[4738]: E0307 07:01:08.953209 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:09 crc kubenswrapper[4738]: E0307 07:01:09.053950 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:09 crc kubenswrapper[4738]: E0307 07:01:09.154989 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:09 crc kubenswrapper[4738]: E0307 07:01:09.222628 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.227993 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.228056 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.228068 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.228088 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.228101 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:09Z","lastTransitionTime":"2026-03-07T07:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:09 crc kubenswrapper[4738]: E0307 07:01:09.239554 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.243542 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.243693 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.243779 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.243877 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.243958 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:09Z","lastTransitionTime":"2026-03-07T07:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:09 crc kubenswrapper[4738]: E0307 07:01:09.254528 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.258477 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.258546 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.258561 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.258581 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.258595 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:09Z","lastTransitionTime":"2026-03-07T07:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:09 crc kubenswrapper[4738]: E0307 07:01:09.272001 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.276970 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.277057 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.277102 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.277129 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:09 crc kubenswrapper[4738]: I0307 07:01:09.277147 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:09Z","lastTransitionTime":"2026-03-07T07:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:09 crc kubenswrapper[4738]: E0307 07:01:09.296061 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:09 crc kubenswrapper[4738]: E0307 07:01:09.296440 4738 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 07:01:09 crc kubenswrapper[4738]: E0307 07:01:09.296497 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:09 crc kubenswrapper[4738]: E0307 07:01:09.396867 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:09 crc kubenswrapper[4738]: E0307 07:01:09.497072 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:09 crc kubenswrapper[4738]: E0307 07:01:09.597644 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:09 crc kubenswrapper[4738]: E0307 07:01:09.698176 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:09 crc kubenswrapper[4738]: E0307 07:01:09.798686 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:09 crc kubenswrapper[4738]: E0307 07:01:09.899881 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:10 crc kubenswrapper[4738]: E0307 07:01:10.000497 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:10 crc kubenswrapper[4738]: E0307 07:01:10.101436 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:10 crc kubenswrapper[4738]: E0307 07:01:10.201889 4738 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:10 crc kubenswrapper[4738]: E0307 07:01:10.302589 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:10 crc kubenswrapper[4738]: E0307 07:01:10.403062 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:10 crc kubenswrapper[4738]: E0307 07:01:10.504008 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:10 crc kubenswrapper[4738]: E0307 07:01:10.604540 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:10 crc kubenswrapper[4738]: E0307 07:01:10.705241 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:10 crc kubenswrapper[4738]: E0307 07:01:10.805410 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:10 crc kubenswrapper[4738]: E0307 07:01:10.906060 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:11 crc kubenswrapper[4738]: E0307 07:01:11.006560 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:11 crc kubenswrapper[4738]: E0307 07:01:11.106955 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:11 crc kubenswrapper[4738]: E0307 07:01:11.207952 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:11 crc kubenswrapper[4738]: E0307 07:01:11.308324 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:11 crc 
kubenswrapper[4738]: I0307 07:01:11.385309 4738 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 07:01:11 crc kubenswrapper[4738]: I0307 07:01:11.386897 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:11 crc kubenswrapper[4738]: I0307 07:01:11.386983 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:11 crc kubenswrapper[4738]: I0307 07:01:11.387002 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:11 crc kubenswrapper[4738]: I0307 07:01:11.387925 4738 scope.go:117] "RemoveContainer" containerID="1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233" Mar 07 07:01:11 crc kubenswrapper[4738]: E0307 07:01:11.388175 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 07:01:11 crc kubenswrapper[4738]: E0307 07:01:11.408959 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:11 crc kubenswrapper[4738]: E0307 07:01:11.509597 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:11 crc kubenswrapper[4738]: E0307 07:01:11.610002 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:11 crc kubenswrapper[4738]: E0307 07:01:11.710719 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 
07:01:11 crc kubenswrapper[4738]: E0307 07:01:11.810881 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:11 crc kubenswrapper[4738]: E0307 07:01:11.911755 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:12 crc kubenswrapper[4738]: E0307 07:01:12.011937 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:12 crc kubenswrapper[4738]: E0307 07:01:12.112956 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:12 crc kubenswrapper[4738]: E0307 07:01:12.214020 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:12 crc kubenswrapper[4738]: E0307 07:01:12.314939 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:12 crc kubenswrapper[4738]: E0307 07:01:12.416172 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:12 crc kubenswrapper[4738]: E0307 07:01:12.489015 4738 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 07 07:01:12 crc kubenswrapper[4738]: E0307 07:01:12.516265 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:12 crc kubenswrapper[4738]: E0307 07:01:12.616477 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:12 crc kubenswrapper[4738]: E0307 07:01:12.716806 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:12 crc kubenswrapper[4738]: E0307 07:01:12.817866 4738 kubelet_node_status.go:503] "Error getting 
the current node from lister" err="node \"crc\" not found" Mar 07 07:01:12 crc kubenswrapper[4738]: E0307 07:01:12.918207 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:13 crc kubenswrapper[4738]: E0307 07:01:13.018891 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:13 crc kubenswrapper[4738]: E0307 07:01:13.119897 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:13 crc kubenswrapper[4738]: E0307 07:01:13.220322 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:13 crc kubenswrapper[4738]: E0307 07:01:13.321191 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:13 crc kubenswrapper[4738]: E0307 07:01:13.421728 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:13 crc kubenswrapper[4738]: E0307 07:01:13.522139 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:13 crc kubenswrapper[4738]: E0307 07:01:13.623176 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:13 crc kubenswrapper[4738]: E0307 07:01:13.723502 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:13 crc kubenswrapper[4738]: E0307 07:01:13.824512 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:13 crc kubenswrapper[4738]: E0307 07:01:13.925400 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:14 crc kubenswrapper[4738]: E0307 07:01:14.026318 4738 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:14 crc kubenswrapper[4738]: E0307 07:01:14.127400 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:14 crc kubenswrapper[4738]: E0307 07:01:14.228533 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:14 crc kubenswrapper[4738]: E0307 07:01:14.329582 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:14 crc kubenswrapper[4738]: E0307 07:01:14.430609 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:14 crc kubenswrapper[4738]: E0307 07:01:14.531453 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:14 crc kubenswrapper[4738]: E0307 07:01:14.632536 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:14 crc kubenswrapper[4738]: E0307 07:01:14.733331 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:14 crc kubenswrapper[4738]: E0307 07:01:14.834073 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:14 crc kubenswrapper[4738]: E0307 07:01:14.935271 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:15 crc kubenswrapper[4738]: E0307 07:01:15.035756 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:15 crc kubenswrapper[4738]: E0307 07:01:15.135982 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:15 crc 
kubenswrapper[4738]: E0307 07:01:15.236438 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:15 crc kubenswrapper[4738]: E0307 07:01:15.337298 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:15 crc kubenswrapper[4738]: E0307 07:01:15.437504 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:15 crc kubenswrapper[4738]: E0307 07:01:15.538464 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:15 crc kubenswrapper[4738]: E0307 07:01:15.638696 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:15 crc kubenswrapper[4738]: E0307 07:01:15.739849 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:15 crc kubenswrapper[4738]: E0307 07:01:15.839969 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:15 crc kubenswrapper[4738]: E0307 07:01:15.940920 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:16 crc kubenswrapper[4738]: E0307 07:01:16.041875 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:16 crc kubenswrapper[4738]: E0307 07:01:16.142252 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:16 crc kubenswrapper[4738]: E0307 07:01:16.243394 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:16 crc kubenswrapper[4738]: E0307 07:01:16.343771 4738 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 07 07:01:16 crc kubenswrapper[4738]: E0307 07:01:16.444151 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:16 crc kubenswrapper[4738]: E0307 07:01:16.544726 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:16 crc kubenswrapper[4738]: E0307 07:01:16.644820 4738 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.667036 4738 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.747965 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.748031 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.748049 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.748075 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.748101 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:16Z","lastTransitionTime":"2026-03-07T07:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.851571 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.851641 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.851664 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.851693 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.851711 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:16Z","lastTransitionTime":"2026-03-07T07:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.954428 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.954469 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.954479 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.954496 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:16 crc kubenswrapper[4738]: I0307 07:01:16.954505 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:16Z","lastTransitionTime":"2026-03-07T07:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.056907 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.056941 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.056950 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.056966 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.056978 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:17Z","lastTransitionTime":"2026-03-07T07:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.159506 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.159581 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.159595 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.159613 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.159625 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:17Z","lastTransitionTime":"2026-03-07T07:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.262227 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.262267 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.262280 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.262301 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.262318 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:17Z","lastTransitionTime":"2026-03-07T07:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.365807 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.365842 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.365850 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.365866 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.365875 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:17Z","lastTransitionTime":"2026-03-07T07:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.478762 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.478799 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.478807 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.478823 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.478836 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:17Z","lastTransitionTime":"2026-03-07T07:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.580260 4738 apiserver.go:52] "Watching apiserver" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.583143 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.583231 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.583246 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.583268 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.583282 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:17Z","lastTransitionTime":"2026-03-07T07:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.587561 4738 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.587938 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c"] Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.588531 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.588620 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.588687 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.588973 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.589024 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.588621 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.588989 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.589094 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.589441 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.592654 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.594531 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.594972 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.595480 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.595858 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.596097 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.596391 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.596753 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.596966 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.622267 4738 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 07 07:01:17 crc kubenswrapper[4738]: 
I0307 07:01:17.638086 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.656479 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.672052 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.687128 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.687179 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.687191 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.687211 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.687223 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:17Z","lastTransitionTime":"2026-03-07T07:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.690504 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.703987 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704101 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704203 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704250 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704291 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704326 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704362 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704383 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704411 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704449 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704485 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704520 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704556 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704589 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704626 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704653 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704666 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704674 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704702 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704740 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704781 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704815 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704844 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704850 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704873 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704904 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704933 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704950 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704960 4738 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.704969 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.705041 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.705079 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.705102 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.705223 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.705394 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.705821 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.705983 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.705979 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.706050 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.706202 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.706280 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.706553 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.706583 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.706643 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.706831 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.706940 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.707195 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.707365 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.707460 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.707521 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") 
pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.707582 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.707631 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.707672 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.707692 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.707709 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.707904 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.707966 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708025 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708078 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708133 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708225 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708283 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708297 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708333 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708389 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708449 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708485 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708504 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708554 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708607 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708657 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708731 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708786 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708840 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708892 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708946 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709003 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709058 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 07 07:01:17 
crc kubenswrapper[4738]: I0307 07:01:17.709111 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709197 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709254 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709313 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709368 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709425 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: 
\"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709477 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709533 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709592 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709643 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709695 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 
07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709747 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709799 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709849 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709900 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709952 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710000 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710053 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710106 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710244 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710301 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710352 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710404 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710458 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710507 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710562 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710611 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710658 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710707 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710760 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710812 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708729 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708747 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708759 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.708969 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709111 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.714007 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709120 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709333 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709477 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709512 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.709902 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710058 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.714152 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710124 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710554 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710716 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.710882 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.711260 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.711287 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.711779 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.711843 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.712235 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.712509 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.712637 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.712878 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.713069 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.713400 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.713222 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.714483 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.713648 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.713872 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.714022 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.714250 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.714439 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.714657 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.715351 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.715479 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.715680 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.715738 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.715873 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.716168 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.716281 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.716541 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.716583 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.716660 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.716623 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.716730 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.716723 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.716804 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.716867 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.716924 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 07 07:01:17 
crc kubenswrapper[4738]: I0307 07:01:17.716979 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.717032 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.717046 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.717079 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.717095 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.717134 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.717234 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.717286 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.717340 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.717393 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.717448 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.717499 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.717550 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.717607 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.717665 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.717772 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.718851 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.718980 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719018 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719229 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719185 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719298 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719341 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719381 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 07 07:01:17 crc 
kubenswrapper[4738]: I0307 07:01:17.719418 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719455 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719502 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719554 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719608 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719660 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719700 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719748 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719802 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719862 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719920 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 07 
07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720051 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720092 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720222 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720282 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720338 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720379 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720422 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720472 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720513 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720549 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720592 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720636 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720685 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720744 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720799 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720852 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720916 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720969 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.721021 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.721073 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.721129 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.721272 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 07 07:01:17 crc 
kubenswrapper[4738]: I0307 07:01:17.721349 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.721405 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.721456 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719450 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.719880 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720053 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720815 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.720974 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.721115 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.721604 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.721648 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.722412 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.722571 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.722698 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.723145 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.722941 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.723751 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.723784 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.723810 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.724146 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.724481 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.724503 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.724523 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.724556 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.724601 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.724738 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.724979 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.725068 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.725271 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.725543 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.725538 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.725699 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.725874 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.725900 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.726432 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.726468 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.721508 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.726567 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.726776 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.726826 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.726872 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.726910 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.726945 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.726981 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.727035 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.727073 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.727112 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " 
Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.727147 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.727208 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.727135 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.727304 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.727460 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.727514 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.727822 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.727868 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728038 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728055 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728066 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728078 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728088 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728292 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728340 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728376 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728413 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728449 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728553 4738 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728597 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728644 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728687 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728732 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728772 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.728888 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:01:18.228773393 +0000 UTC m=+96.693760914 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.728994 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.729284 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.729581 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.730124 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.730108 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.730463 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.731004 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.731004 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.731700 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.731113 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.731505 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.731844 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.731622 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.731905 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.731961 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732014 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732070 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732132 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732245 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732303 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732367 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732420 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732477 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732536 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: 
\"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732591 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732644 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732699 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732753 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732807 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732870 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: 
\"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732930 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.732992 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.733054 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.733111 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.733298 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.733364 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.733460 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.733518 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.733558 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.733601 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.733655 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.733711 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.733767 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.733832 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.733891 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.733944 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734006 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734070 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734107 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734237 
4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734372 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734406 4738 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734437 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734470 4738 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734499 4738 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734528 4738 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc 
kubenswrapper[4738]: I0307 07:01:17.734556 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734585 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734613 4738 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734634 4738 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734655 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734685 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734713 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734741 4738 reconciler_common.go:293] "Volume detached for volume 
\"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734770 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734799 4738 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734828 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734855 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734881 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734914 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734942 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.734973 4738 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.735001 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.735027 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.735053 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.735077 4738 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.735066 4738 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.735103 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc 
kubenswrapper[4738]: I0307 07:01:17.735518 4738 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.735668 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.736292 4738 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.736629 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:18.235413807 +0000 UTC m=+96.700401158 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.736801 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.736847 4738 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.736877 4738 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.736918 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:18.236893491 +0000 UTC m=+96.701881082 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.736950 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.736980 4738 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737010 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737033 4738 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737055 4738 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737076 4738 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 
07:01:17.737105 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737132 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737215 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737237 4738 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737257 4738 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737277 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737300 4738 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737320 4738 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737340 4738 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737359 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737363 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737381 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737403 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737445 4738 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc 
kubenswrapper[4738]: I0307 07:01:17.737479 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737506 4738 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737531 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737559 4738 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737588 4738 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737635 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737662 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737693 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" 
(UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737720 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737746 4738 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737772 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737798 4738 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737824 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737854 4738 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737880 4738 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737908 4738 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737939 4738 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737965 4738 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.737991 4738 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738018 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738047 4738 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738074 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc 
kubenswrapper[4738]: I0307 07:01:17.738100 4738 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738223 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738269 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738297 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738326 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738358 4738 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738385 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738412 4738 reconciler_common.go:293] "Volume 
detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738439 4738 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738464 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738490 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738518 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738543 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738569 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738595 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: 
\"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738622 4738 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738647 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738673 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738699 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738724 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738749 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738773 4738 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" 
DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738798 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738824 4738 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738851 4738 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738878 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738909 4738 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738934 4738 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738958 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.738985 4738 
reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.739011 4738 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.739036 4738 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.739553 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.739591 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.739935 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740052 4738 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740134 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740189 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740221 4738 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740250 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740280 4738 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740312 4738 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740341 4738 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740367 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740395 4738 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740425 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740451 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740477 4738 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740502 4738 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740529 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740555 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on 
node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740580 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740607 4738 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740634 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740660 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740684 4738 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.740708 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.741371 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.741843 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.741953 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.742034 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.744951 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.746272 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.746449 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.746135 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.748838 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.754840 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.755425 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" 
(UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.760327 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.760378 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.760400 4738 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.760491 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:18.260467633 +0000 UTC m=+96.725454984 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.765809 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.767023 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.768251 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.773278 4738 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.773537 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.773776 4738 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.774335 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:18.274098154 +0000 UTC m=+96.739085475 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.776463 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.776576 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.777177 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.777143 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.777441 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.777641 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.777780 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.777849 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.777862 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.778645 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.779484 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.779582 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.779707 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.779781 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.779729 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.779874 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.779880 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.779966 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.780191 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.780342 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.780395 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.780742 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.780825 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.780933 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.781233 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.781436 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.781878 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.781908 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.781928 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.782303 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.782542 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.782879 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.783292 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.783454 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.783466 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.783522 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.783654 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.783053 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.783944 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.783998 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.784133 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.784256 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.784321 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.784343 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.784481 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.784625 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.784666 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.784766 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.784918 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.786510 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.786966 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.792354 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.787592 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.787683 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.787744 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.788515 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.792436 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.792497 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.792519 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.792533 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:17Z","lastTransitionTime":"2026-03-07T07:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.788737 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.788623 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.797647 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.801688 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.813195 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.819472 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.842619 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.842704 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.842808 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node 
\"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.842820 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.842832 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.842916 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.842919 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.842964 4738 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.842980 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.842990 4738 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843007 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843019 4738 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843029 4738 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843043 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843040 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843052 4738 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843121 4738 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843138 4738 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843199 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843217 4738 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843233 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843254 4738 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843270 4738 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843284 4738 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843303 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843323 4738 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843341 4738 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843356 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843371 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843393 4738 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843412 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843427 4738 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843440 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843457 4738 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843470 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843484 4738 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843507 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843522 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc 
kubenswrapper[4738]: I0307 07:01:17.843539 4738 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843554 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843575 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843591 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843608 4738 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843622 4738 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843639 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843655 4738 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843669 4738 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843688 4738 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843701 4738 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843718 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843735 4738 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843754 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843770 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843786 4738 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843800 4738 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843819 4738 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843833 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843847 4738 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843861 4738 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843880 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843895 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843910 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843929 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843943 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843958 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843974 4738 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.843997 4738 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.844011 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.844026 4738 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.844043 4738 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.894953 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.894999 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.895013 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.895034 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.895050 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:17Z","lastTransitionTime":"2026-03-07T07:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.918244 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.933527 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.934665 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:17 crc kubenswrapper[4738]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 07 07:01:17 crc kubenswrapper[4738]: set -o allexport Mar 07 07:01:17 crc kubenswrapper[4738]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 07 07:01:17 crc kubenswrapper[4738]: source /etc/kubernetes/apiserver-url.env Mar 07 07:01:17 crc kubenswrapper[4738]: else Mar 07 07:01:17 crc kubenswrapper[4738]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 07 07:01:17 crc kubenswrapper[4738]: exit 1 Mar 07 07:01:17 crc kubenswrapper[4738]: fi Mar 07 07:01:17 crc kubenswrapper[4738]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 07 07:01:17 crc kubenswrapper[4738]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:17 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.935867 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.945454 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.950738 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:17 crc kubenswrapper[4738]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 07 07:01:17 crc kubenswrapper[4738]: if [[ -f "/env/_master" ]]; then Mar 07 07:01:17 crc kubenswrapper[4738]: set -o allexport Mar 07 07:01:17 crc kubenswrapper[4738]: source "/env/_master" Mar 07 07:01:17 crc kubenswrapper[4738]: set +o allexport Mar 07 07:01:17 crc kubenswrapper[4738]: fi Mar 07 07:01:17 crc kubenswrapper[4738]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 07 07:01:17 crc kubenswrapper[4738]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 07 07:01:17 crc kubenswrapper[4738]: ho_enable="--enable-hybrid-overlay" Mar 07 07:01:17 crc kubenswrapper[4738]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 07 07:01:17 crc kubenswrapper[4738]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 07 07:01:17 crc kubenswrapper[4738]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 07 07:01:17 crc kubenswrapper[4738]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 07 07:01:17 crc kubenswrapper[4738]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 07 07:01:17 crc kubenswrapper[4738]: --webhook-host=127.0.0.1 \ Mar 07 07:01:17 crc kubenswrapper[4738]: --webhook-port=9743 \ Mar 07 07:01:17 crc kubenswrapper[4738]: ${ho_enable} \ Mar 07 07:01:17 crc kubenswrapper[4738]: --enable-interconnect \ Mar 07 07:01:17 crc kubenswrapper[4738]: --disable-approver \ Mar 07 
07:01:17 crc kubenswrapper[4738]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 07 07:01:17 crc kubenswrapper[4738]: --wait-for-kubernetes-api=200s \ Mar 07 07:01:17 crc kubenswrapper[4738]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 07 07:01:17 crc kubenswrapper[4738]: --loglevel="${LOGLEVEL}" Mar 07 07:01:17 crc kubenswrapper[4738]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Std
in:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:17 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.953378 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:17 crc kubenswrapper[4738]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 07 07:01:17 crc kubenswrapper[4738]: if [[ -f "/env/_master" ]]; then Mar 07 07:01:17 crc kubenswrapper[4738]: set -o allexport Mar 07 07:01:17 crc kubenswrapper[4738]: source "/env/_master" Mar 07 07:01:17 crc kubenswrapper[4738]: set +o allexport Mar 07 07:01:17 crc kubenswrapper[4738]: fi Mar 07 07:01:17 crc kubenswrapper[4738]: Mar 07 07:01:17 crc kubenswrapper[4738]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 07 07:01:17 crc kubenswrapper[4738]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 07 07:01:17 crc kubenswrapper[4738]: --disable-webhook \ Mar 07 07:01:17 crc kubenswrapper[4738]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 07 07:01:17 crc kubenswrapper[4738]: --loglevel="${LOGLEVEL}" Mar 07 07:01:17 crc kubenswrapper[4738]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:17 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.954636 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 07 07:01:17 crc kubenswrapper[4738]: W0307 07:01:17.955791 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-c16597d036af9166496a830dbcc91d61222c5c650ac5928f62fe7eeb5e8be80c WatchSource:0}: Error finding container c16597d036af9166496a830dbcc91d61222c5c650ac5928f62fe7eeb5e8be80c: Status 404 returned error can't find the container with id c16597d036af9166496a830dbcc91d61222c5c650ac5928f62fe7eeb5e8be80c Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.958092 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 07 07:01:17 crc kubenswrapper[4738]: E0307 07:01:17.959427 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.998315 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.998359 
4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.998371 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.998389 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:17 crc kubenswrapper[4738]: I0307 07:01:17.998403 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:17Z","lastTransitionTime":"2026-03-07T07:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.101363 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.101445 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.101459 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.101476 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.101488 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:18Z","lastTransitionTime":"2026-03-07T07:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.203638 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.203671 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.203679 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.203694 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.203706 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:18Z","lastTransitionTime":"2026-03-07T07:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.248018 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.248129 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.248184 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.248336 4738 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.248361 4738 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.248367 4738 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:01:19.248346525 +0000 UTC m=+97.713333846 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.248390 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:19.248384186 +0000 UTC m=+97.713371507 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.248402 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:19.248396927 +0000 UTC m=+97.713384248 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.307267 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.307321 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.307334 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.307354 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.307367 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:18Z","lastTransitionTime":"2026-03-07T07:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.349369 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.349416 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.349548 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.349566 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.349576 4738 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.349629 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:19.34961493 +0000 UTC m=+97.814602251 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.350050 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.350071 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.350080 4738 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.350102 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:19.350095564 +0000 UTC m=+97.815082885 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.388526 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.389111 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.390537 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.391170 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.392122 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.392671 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.393512 4738 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.394500 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.395125 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.396043 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.396594 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.397638 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.398119 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.398672 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.399564 4738 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.400074 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.401083 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.401519 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.402068 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.403099 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.403592 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.404714 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.405204 4738 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.406227 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.406625 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.407230 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.408528 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.409041 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.410044 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.410543 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.410767 4738 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.410856 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.410883 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.410912 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.410932 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:18Z","lastTransitionTime":"2026-03-07T07:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.411480 4738 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.411600 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.413305 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.414246 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.414649 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.416208 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.417309 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.418414 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.419021 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.420117 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.421367 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.422349 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.422925 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.423923 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.424415 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.425319 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.425921 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.427315 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.427977 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.428822 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.429302 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.430518 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.431055 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.431518 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.513768 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.513816 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.513838 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.513860 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.513877 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:18Z","lastTransitionTime":"2026-03-07T07:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.616682 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.616742 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.616758 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.616784 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.616800 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:18Z","lastTransitionTime":"2026-03-07T07:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.719858 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.719907 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.719919 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.719938 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.719951 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:18Z","lastTransitionTime":"2026-03-07T07:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.788481 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3aa34c477063f7f9032a4ee664aab31abd0d45ee3df79ca5ce5f45f1ec01cd2a"} Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.789781 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"980399e15b321d0cf638b06286feb72edbbd000a55e102fec0f47dfaad66ab32"} Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.792357 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:18 crc kubenswrapper[4738]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 07 07:01:18 crc kubenswrapper[4738]: if [[ -f "/env/_master" ]]; then Mar 07 07:01:18 crc kubenswrapper[4738]: set -o allexport Mar 07 07:01:18 crc kubenswrapper[4738]: source "/env/_master" Mar 07 07:01:18 crc kubenswrapper[4738]: set +o allexport Mar 07 07:01:18 crc kubenswrapper[4738]: fi Mar 07 07:01:18 crc kubenswrapper[4738]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 07 07:01:18 crc kubenswrapper[4738]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 07 07:01:18 crc kubenswrapper[4738]: ho_enable="--enable-hybrid-overlay" Mar 07 07:01:18 crc kubenswrapper[4738]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 07 07:01:18 crc kubenswrapper[4738]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 07 07:01:18 crc kubenswrapper[4738]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 07 07:01:18 crc kubenswrapper[4738]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 07 07:01:18 crc kubenswrapper[4738]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 07 07:01:18 crc kubenswrapper[4738]: --webhook-host=127.0.0.1 \ Mar 07 07:01:18 crc kubenswrapper[4738]: --webhook-port=9743 \ Mar 07 07:01:18 crc kubenswrapper[4738]: ${ho_enable} \ Mar 07 07:01:18 crc kubenswrapper[4738]: --enable-interconnect \ Mar 07 07:01:18 crc kubenswrapper[4738]: --disable-approver \ Mar 07 07:01:18 crc kubenswrapper[4738]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 07 07:01:18 crc kubenswrapper[4738]: --wait-for-kubernetes-api=200s \ Mar 07 07:01:18 crc kubenswrapper[4738]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 07 07:01:18 crc kubenswrapper[4738]: --loglevel="${LOGLEVEL}" Mar 07 07:01:18 crc kubenswrapper[4738]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:18 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.793312 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:18 crc kubenswrapper[4738]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 07 
07:01:18 crc kubenswrapper[4738]: set -o allexport Mar 07 07:01:18 crc kubenswrapper[4738]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 07 07:01:18 crc kubenswrapper[4738]: source /etc/kubernetes/apiserver-url.env Mar 07 07:01:18 crc kubenswrapper[4738]: else Mar 07 07:01:18 crc kubenswrapper[4738]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 07 07:01:18 crc kubenswrapper[4738]: exit 1 Mar 07 07:01:18 crc kubenswrapper[4738]: fi Mar 07 07:01:18 crc kubenswrapper[4738]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 07 07:01:18 crc kubenswrapper[4738]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a247
3a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:18 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 
07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.794275 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c16597d036af9166496a830dbcc91d61222c5c650ac5928f62fe7eeb5e8be80c"} Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.794537 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.795691 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:18 crc kubenswrapper[4738]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 07 07:01:18 crc kubenswrapper[4738]: if [[ -f "/env/_master" ]]; then Mar 07 07:01:18 crc kubenswrapper[4738]: set -o allexport Mar 07 07:01:18 crc kubenswrapper[4738]: source "/env/_master" Mar 07 07:01:18 crc kubenswrapper[4738]: set +o allexport Mar 07 07:01:18 crc kubenswrapper[4738]: fi Mar 07 07:01:18 crc kubenswrapper[4738]: Mar 07 07:01:18 crc kubenswrapper[4738]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 07 07:01:18 crc kubenswrapper[4738]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 07 07:01:18 crc kubenswrapper[4738]: --disable-webhook \ Mar 07 07:01:18 crc kubenswrapper[4738]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 07 07:01:18 crc kubenswrapper[4738]: --loglevel="${LOGLEVEL}" Mar 07 07:01:18 crc kubenswrapper[4738]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:18 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.796326 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.797401 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 07 07:01:18 crc kubenswrapper[4738]: E0307 07:01:18.797456 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.807623 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.819402 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.822640 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.822688 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.822700 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.822719 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.822732 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:18Z","lastTransitionTime":"2026-03-07T07:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.831967 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.844106 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.856786 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.873018 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.887934 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.903224 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.917404 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.925771 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.925811 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.925820 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 
07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.925837 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.925848 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:18Z","lastTransitionTime":"2026-03-07T07:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.933330 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.948730 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:18 crc kubenswrapper[4738]: I0307 07:01:18.962692 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.029339 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:19 crc 
kubenswrapper[4738]: I0307 07:01:19.029426 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.029449 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.029483 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.029506 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:19Z","lastTransitionTime":"2026-03-07T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.132729 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.132933 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.132955 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.132981 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.132997 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:19Z","lastTransitionTime":"2026-03-07T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.237044 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.237124 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.237146 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.237203 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.237223 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:19Z","lastTransitionTime":"2026-03-07T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.258910 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.259073 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.259115 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:01:21.259070584 +0000 UTC m=+99.724057965 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.259325 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.259379 4738 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.259447 4738 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.259521 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:21.259495578 +0000 UTC m=+99.724482939 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.259558 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:21.259540139 +0000 UTC m=+99.724527500 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.340903 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.340999 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.341026 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.341054 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.341072 4738 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:19Z","lastTransitionTime":"2026-03-07T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.360552 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.360624 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.360835 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.360885 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.360909 4738 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.360846 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.361025 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.360988 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:21.360961568 +0000 UTC m=+99.825948929 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.361101 4738 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.361339 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:21.361251096 +0000 UTC m=+99.826238637 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.385138 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.385207 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.385260 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.385431 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.385863 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.385718 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.439826 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.439901 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.439920 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.439948 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.439969 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:19Z","lastTransitionTime":"2026-03-07T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.455899 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.460566 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.460638 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.460654 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.460681 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.460695 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:19Z","lastTransitionTime":"2026-03-07T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.475786 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.479867 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.479944 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.479966 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.479995 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.480016 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:19Z","lastTransitionTime":"2026-03-07T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.492340 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.497036 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.497108 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.497130 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.497188 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.497209 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:19Z","lastTransitionTime":"2026-03-07T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.511283 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.514732 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.514770 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.514785 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.514804 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.514820 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:19Z","lastTransitionTime":"2026-03-07T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.530185 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:19 crc kubenswrapper[4738]: E0307 07:01:19.530335 4738 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.532483 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.532548 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.532575 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.532607 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.532626 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:19Z","lastTransitionTime":"2026-03-07T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.635271 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.635322 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.635341 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.635360 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.635371 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:19Z","lastTransitionTime":"2026-03-07T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.738887 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.738946 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.738965 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.738988 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.739011 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:19Z","lastTransitionTime":"2026-03-07T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.841509 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.841569 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.841586 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.841615 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.841634 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:19Z","lastTransitionTime":"2026-03-07T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.944918 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.945011 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.945031 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.945059 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:19 crc kubenswrapper[4738]: I0307 07:01:19.945079 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:19Z","lastTransitionTime":"2026-03-07T07:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.048783 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.048856 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.048869 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.048890 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.048902 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:20Z","lastTransitionTime":"2026-03-07T07:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.152229 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.152571 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.152657 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.152730 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.152793 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:20Z","lastTransitionTime":"2026-03-07T07:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.255918 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.256391 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.256484 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.256600 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.256690 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:20Z","lastTransitionTime":"2026-03-07T07:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.359416 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.359456 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.359472 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.359498 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.359512 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:20Z","lastTransitionTime":"2026-03-07T07:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.462316 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.462693 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.462775 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.462875 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.462953 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:20Z","lastTransitionTime":"2026-03-07T07:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.566027 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.566077 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.566094 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.566119 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.566136 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:20Z","lastTransitionTime":"2026-03-07T07:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.668431 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.668485 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.668497 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.668515 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.668530 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:20Z","lastTransitionTime":"2026-03-07T07:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.771377 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.771456 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.771477 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.771507 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.771535 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:20Z","lastTransitionTime":"2026-03-07T07:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.874401 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.874468 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.874494 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.874525 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.874545 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:20Z","lastTransitionTime":"2026-03-07T07:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.978049 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.978109 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.978126 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.978191 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:20 crc kubenswrapper[4738]: I0307 07:01:20.978210 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:20Z","lastTransitionTime":"2026-03-07T07:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.081736 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.081791 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.081808 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.081835 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.081855 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:21Z","lastTransitionTime":"2026-03-07T07:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.185598 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.185677 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.185712 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.185746 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.185771 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:21Z","lastTransitionTime":"2026-03-07T07:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.280267 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.280904 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 07:01:25.280846932 +0000 UTC m=+103.745834293 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.281715 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.281993 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.282521 4738 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.282652 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-07 07:01:25.282609255 +0000 UTC m=+103.747596616 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.283644 4738 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.283768 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:25.283739517 +0000 UTC m=+103.748726888 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.290169 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.290227 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.290241 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.290261 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.290276 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:21Z","lastTransitionTime":"2026-03-07T07:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.383928 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.384045 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.384259 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.384294 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.384259 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.384333 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.384346 4738 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl 
for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.384313 4738 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.384422 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:25.384401525 +0000 UTC m=+103.849388846 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.384455 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:25.384440136 +0000 UTC m=+103.849427477 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.384684 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.384801 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.384853 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.385031 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.385128 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:21 crc kubenswrapper[4738]: E0307 07:01:21.385352 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.394624 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.394691 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.394715 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.394743 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.394763 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:21Z","lastTransitionTime":"2026-03-07T07:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.410288 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.499301 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.499384 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.499405 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.499440 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.499462 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:21Z","lastTransitionTime":"2026-03-07T07:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.603867 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.603955 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.603975 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.604003 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.604030 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:21Z","lastTransitionTime":"2026-03-07T07:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.708199 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.708275 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.708296 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.708328 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.708352 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:21Z","lastTransitionTime":"2026-03-07T07:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.811524 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.811573 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.811588 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.811606 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.811618 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:21Z","lastTransitionTime":"2026-03-07T07:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.914476 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.914540 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.914553 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.914577 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:21 crc kubenswrapper[4738]: I0307 07:01:21.914592 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:21Z","lastTransitionTime":"2026-03-07T07:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.017735 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.017806 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.017824 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.017851 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.017873 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:22Z","lastTransitionTime":"2026-03-07T07:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.120992 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.121081 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.121100 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.121127 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.121146 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:22Z","lastTransitionTime":"2026-03-07T07:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.224230 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.224317 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.224335 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.224367 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.224395 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:22Z","lastTransitionTime":"2026-03-07T07:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.328346 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.328493 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.328522 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.328557 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.328578 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:22Z","lastTransitionTime":"2026-03-07T07:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.399353 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.412323 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.426597 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.432017 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.432081 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.432097 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.432135 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.432149 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:22Z","lastTransitionTime":"2026-03-07T07:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.439454 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.459567 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.473236 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.515532 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.535108 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.535148 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.535179 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.535199 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.535211 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:22Z","lastTransitionTime":"2026-03-07T07:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.639671 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.639753 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.639776 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.639808 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.639831 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:22Z","lastTransitionTime":"2026-03-07T07:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.743405 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.743484 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.743510 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.743541 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.743563 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:22Z","lastTransitionTime":"2026-03-07T07:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.846715 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.846798 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.846820 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.846849 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.846870 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:22Z","lastTransitionTime":"2026-03-07T07:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.950024 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.950101 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.950125 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.950196 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:22 crc kubenswrapper[4738]: I0307 07:01:22.950228 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:22Z","lastTransitionTime":"2026-03-07T07:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.053309 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.053397 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.053421 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.053458 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.053482 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:23Z","lastTransitionTime":"2026-03-07T07:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.158187 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.158260 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.158275 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.158303 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.158326 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:23Z","lastTransitionTime":"2026-03-07T07:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.261824 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.261904 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.261923 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.261953 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.261974 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:23Z","lastTransitionTime":"2026-03-07T07:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.364792 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.364896 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.364916 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.364945 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.364965 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:23Z","lastTransitionTime":"2026-03-07T07:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.385664 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.385760 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.385676 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:23 crc kubenswrapper[4738]: E0307 07:01:23.385828 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:23 crc kubenswrapper[4738]: E0307 07:01:23.386002 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:23 crc kubenswrapper[4738]: E0307 07:01:23.386238 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.468583 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.468643 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.468655 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.468674 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.468691 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:23Z","lastTransitionTime":"2026-03-07T07:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.572381 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.572429 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.572441 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.572459 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.572472 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:23Z","lastTransitionTime":"2026-03-07T07:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.675309 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.675778 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.675889 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.675992 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.676076 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:23Z","lastTransitionTime":"2026-03-07T07:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.779494 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.779560 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.779580 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.779607 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.779626 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:23Z","lastTransitionTime":"2026-03-07T07:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.882232 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.882307 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.882330 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.882365 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.882388 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:23Z","lastTransitionTime":"2026-03-07T07:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.985334 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.985393 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.985410 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.985432 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:23 crc kubenswrapper[4738]: I0307 07:01:23.985451 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:23Z","lastTransitionTime":"2026-03-07T07:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.088938 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.089446 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.089504 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.089549 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.089604 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:24Z","lastTransitionTime":"2026-03-07T07:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.192679 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.192733 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.192745 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.192764 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.192777 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:24Z","lastTransitionTime":"2026-03-07T07:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.295961 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.296016 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.296028 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.296051 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.296063 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:24Z","lastTransitionTime":"2026-03-07T07:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.399430 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.399506 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.399526 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.399552 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.399573 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:24Z","lastTransitionTime":"2026-03-07T07:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.502392 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.502436 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.502445 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.502462 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.502472 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:24Z","lastTransitionTime":"2026-03-07T07:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.605816 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.605893 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.605912 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.605949 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.605970 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:24Z","lastTransitionTime":"2026-03-07T07:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.709923 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.710011 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.710033 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.710064 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.710086 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:24Z","lastTransitionTime":"2026-03-07T07:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.813881 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.813918 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.813926 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.813944 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.813956 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:24Z","lastTransitionTime":"2026-03-07T07:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.917932 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.918002 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.918089 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.918902 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:24 crc kubenswrapper[4738]: I0307 07:01:24.918939 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:24Z","lastTransitionTime":"2026-03-07T07:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.022736 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.022778 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.022804 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.022821 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.022830 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:25Z","lastTransitionTime":"2026-03-07T07:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.126462 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.126569 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.126589 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.126626 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.126681 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:25Z","lastTransitionTime":"2026-03-07T07:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.230314 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.230449 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.230470 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.230495 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.230513 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:25Z","lastTransitionTime":"2026-03-07T07:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.322941 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.323055 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.323136 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.323300 4738 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.323420 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:33.323385679 +0000 UTC m=+111.788373040 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.323483 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:01:33.323456961 +0000 UTC m=+111.788444322 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.323686 4738 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.323857 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:33.323815912 +0000 UTC m=+111.788803273 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.333634 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.333699 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.333721 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.333746 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.333764 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:25Z","lastTransitionTime":"2026-03-07T07:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.384726 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.385000 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.385071 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.385596 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.385763 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.385920 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.400728 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.401095 4738 scope.go:117] "RemoveContainer" containerID="1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233" Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.401503 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.424013 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.424090 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.424322 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.424352 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.424371 4738 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.424407 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.424442 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.424447 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:33.424425427 +0000 UTC m=+111.889412778 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.424462 4738 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.424547 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:33.42451425 +0000 UTC m=+111.889501611 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.437129 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.437516 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.437611 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.437736 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.437833 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:25Z","lastTransitionTime":"2026-03-07T07:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.541271 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.541309 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.541317 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.541333 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.541342 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:25Z","lastTransitionTime":"2026-03-07T07:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.644598 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.644671 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.644692 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.644719 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.644737 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:25Z","lastTransitionTime":"2026-03-07T07:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.747643 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.747706 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.747717 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.747734 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.747746 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:25Z","lastTransitionTime":"2026-03-07T07:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.817018 4738 scope.go:117] "RemoveContainer" containerID="1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233" Mar 07 07:01:25 crc kubenswrapper[4738]: E0307 07:01:25.817584 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.851368 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.851703 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.851882 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.851993 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.852088 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:25Z","lastTransitionTime":"2026-03-07T07:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.956431 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.957527 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.957689 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.957848 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:25 crc kubenswrapper[4738]: I0307 07:01:25.958023 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:25Z","lastTransitionTime":"2026-03-07T07:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.062462 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.062831 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.062925 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.063024 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.063120 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:26Z","lastTransitionTime":"2026-03-07T07:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.166811 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.167150 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.167238 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.167338 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.167399 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:26Z","lastTransitionTime":"2026-03-07T07:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.257235 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-mbtvs"] Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.258057 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-mbtvs" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.260636 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.261443 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.261863 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.270081 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.270485 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.270626 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.270746 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.270855 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:26Z","lastTransitionTime":"2026-03-07T07:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.276802 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.294986 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.313627 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.326484 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.333494 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ed4db713-ac09-4c8e-ab4c-f9031c78d476-hosts-file\") pod \"node-resolver-mbtvs\" (UID: \"ed4db713-ac09-4c8e-ab4c-f9031c78d476\") " pod="openshift-dns/node-resolver-mbtvs" Mar 07 07:01:26 crc kubenswrapper[4738]: 
I0307 07:01:26.333552 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zllrn\" (UniqueName: \"kubernetes.io/projected/ed4db713-ac09-4c8e-ab4c-f9031c78d476-kube-api-access-zllrn\") pod \"node-resolver-mbtvs\" (UID: \"ed4db713-ac09-4c8e-ab4c-f9031c78d476\") " pod="openshift-dns/node-resolver-mbtvs" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.352390 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:
59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-co
py\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.365561 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9
8592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.374264 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.374292 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.374300 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.374319 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.374330 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:26Z","lastTransitionTime":"2026-03-07T07:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.379926 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.392749 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.407015 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.434707 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/ed4db713-ac09-4c8e-ab4c-f9031c78d476-hosts-file\") pod \"node-resolver-mbtvs\" (UID: \"ed4db713-ac09-4c8e-ab4c-f9031c78d476\") " pod="openshift-dns/node-resolver-mbtvs" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.434777 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zllrn\" (UniqueName: \"kubernetes.io/projected/ed4db713-ac09-4c8e-ab4c-f9031c78d476-kube-api-access-zllrn\") pod \"node-resolver-mbtvs\" (UID: \"ed4db713-ac09-4c8e-ab4c-f9031c78d476\") " pod="openshift-dns/node-resolver-mbtvs" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.434974 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/ed4db713-ac09-4c8e-ab4c-f9031c78d476-hosts-file\") pod \"node-resolver-mbtvs\" (UID: \"ed4db713-ac09-4c8e-ab4c-f9031c78d476\") " pod="openshift-dns/node-resolver-mbtvs" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.455202 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zllrn\" (UniqueName: \"kubernetes.io/projected/ed4db713-ac09-4c8e-ab4c-f9031c78d476-kube-api-access-zllrn\") pod \"node-resolver-mbtvs\" (UID: \"ed4db713-ac09-4c8e-ab4c-f9031c78d476\") " pod="openshift-dns/node-resolver-mbtvs" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.477467 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.477527 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.477536 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.477554 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.477567 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:26Z","lastTransitionTime":"2026-03-07T07:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.580615 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-mbtvs" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.581551 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.581607 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.581616 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.581636 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.581650 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:26Z","lastTransitionTime":"2026-03-07T07:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:26 crc kubenswrapper[4738]: E0307 07:01:26.599584 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:26 crc kubenswrapper[4738]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 07 07:01:26 crc kubenswrapper[4738]: set -uo pipefail Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 07 07:01:26 crc kubenswrapper[4738]: HOSTS_FILE="/etc/hosts" Mar 07 07:01:26 crc kubenswrapper[4738]: TEMP_FILE="/etc/hosts.tmp" Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: # Make a temporary file with the old hosts file's attributes. Mar 07 07:01:26 crc kubenswrapper[4738]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 07 07:01:26 crc kubenswrapper[4738]: echo "Failed to preserve hosts file. Exiting." Mar 07 07:01:26 crc kubenswrapper[4738]: exit 1 Mar 07 07:01:26 crc kubenswrapper[4738]: fi Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: while true; do Mar 07 07:01:26 crc kubenswrapper[4738]: declare -A svc_ips Mar 07 07:01:26 crc kubenswrapper[4738]: for svc in "${services[@]}"; do Mar 07 07:01:26 crc kubenswrapper[4738]: # Fetch service IP from cluster dns if present. We make several tries Mar 07 07:01:26 crc kubenswrapper[4738]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. 
The two last ones Mar 07 07:01:26 crc kubenswrapper[4738]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 07 07:01:26 crc kubenswrapper[4738]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 07 07:01:26 crc kubenswrapper[4738]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 07 07:01:26 crc kubenswrapper[4738]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 07 07:01:26 crc kubenswrapper[4738]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 07 07:01:26 crc kubenswrapper[4738]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 07 07:01:26 crc kubenswrapper[4738]: for i in ${!cmds[*]} Mar 07 07:01:26 crc kubenswrapper[4738]: do Mar 07 07:01:26 crc kubenswrapper[4738]: ips=($(eval "${cmds[i]}")) Mar 07 07:01:26 crc kubenswrapper[4738]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 07 07:01:26 crc kubenswrapper[4738]: svc_ips["${svc}"]="${ips[@]}" Mar 07 07:01:26 crc kubenswrapper[4738]: break Mar 07 07:01:26 crc kubenswrapper[4738]: fi Mar 07 07:01:26 crc kubenswrapper[4738]: done Mar 07 07:01:26 crc kubenswrapper[4738]: done Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: # Update /etc/hosts only if we get valid service IPs Mar 07 07:01:26 crc kubenswrapper[4738]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 07 07:01:26 crc kubenswrapper[4738]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 07 07:01:26 crc kubenswrapper[4738]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 07 07:01:26 crc kubenswrapper[4738]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 07 07:01:26 crc kubenswrapper[4738]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 07 07:01:26 crc kubenswrapper[4738]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 07 07:01:26 crc kubenswrapper[4738]: sleep 60 & wait Mar 07 07:01:26 crc kubenswrapper[4738]: continue Mar 07 07:01:26 crc kubenswrapper[4738]: fi Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: # Append resolver entries for services Mar 07 07:01:26 crc kubenswrapper[4738]: rc=0 Mar 07 07:01:26 crc kubenswrapper[4738]: for svc in "${!svc_ips[@]}"; do Mar 07 07:01:26 crc kubenswrapper[4738]: for ip in ${svc_ips[${svc}]}; do Mar 07 07:01:26 crc kubenswrapper[4738]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 07 07:01:26 crc kubenswrapper[4738]: done Mar 07 07:01:26 crc kubenswrapper[4738]: done Mar 07 07:01:26 crc kubenswrapper[4738]: if [[ $rc -ne 0 ]]; then Mar 07 07:01:26 crc kubenswrapper[4738]: sleep 60 & wait Mar 07 07:01:26 crc kubenswrapper[4738]: continue Mar 07 07:01:26 crc kubenswrapper[4738]: fi Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 07 07:01:26 crc kubenswrapper[4738]: # Replace /etc/hosts with our modified version if needed Mar 07 07:01:26 crc kubenswrapper[4738]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 07 07:01:26 crc kubenswrapper[4738]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 07 07:01:26 crc kubenswrapper[4738]: fi Mar 07 07:01:26 crc kubenswrapper[4738]: sleep 60 & wait Mar 07 07:01:26 crc kubenswrapper[4738]: unset svc_ips Mar 07 07:01:26 crc kubenswrapper[4738]: done Mar 07 07:01:26 crc kubenswrapper[4738]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zllrn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-mbtvs_openshift-dns(ed4db713-ac09-4c8e-ab4c-f9031c78d476): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:26 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:26 crc kubenswrapper[4738]: E0307 07:01:26.601108 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-mbtvs" 
podUID="ed4db713-ac09-4c8e-ab4c-f9031c78d476" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.630219 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-54cnw"] Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.630920 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-fmp5z"] Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.631735 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-t7vcc"] Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.632218 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.632728 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.633296 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.635079 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.635582 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.635716 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.637389 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.637616 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.637908 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.638024 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.638142 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.638303 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.639063 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.640343 4738 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.640753 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.675856 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kube
rnetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9
87dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.684050 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.684103 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.684112 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.684133 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.684145 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:26Z","lastTransitionTime":"2026-03-07T07:01:26Z","reason":"KubeletNotReady","message":"container runtime 
network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.696427 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.718733 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.732322 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736402 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwsww\" (UniqueName: \"kubernetes.io/projected/0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7-kube-api-access-fwsww\") pod \"machine-config-daemon-t7vcc\" (UID: \"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\") " pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736450 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-os-release\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " 
pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736490 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-system-cni-dir\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736516 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7-rootfs\") pod \"machine-config-daemon-t7vcc\" (UID: \"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\") " pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736541 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-var-lib-cni-bin\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736556 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-run-k8s-cni-cncf-io\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736570 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-var-lib-cni-multus\") pod 
\"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736592 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736646 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c0a91659-d53f-4694-82a7-8c66445ab4f5-multus-daemon-config\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736684 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736702 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7-mcd-auth-proxy-config\") pod \"machine-config-daemon-t7vcc\" (UID: \"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\") " pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736725 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" 
(UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-run-netns\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736747 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-run-multus-certs\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736770 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn2s9\" (UniqueName: \"kubernetes.io/projected/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-kube-api-access-zn2s9\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736790 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7-proxy-tls\") pod \"machine-config-daemon-t7vcc\" (UID: \"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\") " pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736812 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-cnibin\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736840 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-system-cni-dir\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736863 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-cnibin\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736891 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c0a91659-d53f-4694-82a7-8c66445ab4f5-cni-binary-copy\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736931 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-var-lib-kubelet\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.736981 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-622rm\" (UniqueName: \"kubernetes.io/projected/c0a91659-d53f-4694-82a7-8c66445ab4f5-kube-api-access-622rm\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.737006 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-hostroot\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.737022 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-etc-kubernetes\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.737039 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-multus-socket-dir-parent\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.737064 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-multus-conf-dir\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.737082 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-multus-cni-dir\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.737098 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-os-release\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.737113 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-cni-binary-copy\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.747127 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.758632 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.769876 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.780121 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.787217 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.787264 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.787277 4738 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.787298 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.787312 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:26Z","lastTransitionTime":"2026-03-07T07:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.795963 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.811889 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.820226 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mbtvs" event={"ID":"ed4db713-ac09-4c8e-ab4c-f9031c78d476","Type":"ContainerStarted","Data":"352bd11e7cceb93648192a90efe82151134aba216592bbbe24976c9c679bc344"} Mar 07 07:01:26 crc kubenswrapper[4738]: E0307 07:01:26.822140 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:26 crc kubenswrapper[4738]: container 
&Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 07 07:01:26 crc kubenswrapper[4738]: set -uo pipefail Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 07 07:01:26 crc kubenswrapper[4738]: HOSTS_FILE="/etc/hosts" Mar 07 07:01:26 crc kubenswrapper[4738]: TEMP_FILE="/etc/hosts.tmp" Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: # Make a temporary file with the old hosts file's attributes. Mar 07 07:01:26 crc kubenswrapper[4738]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 07 07:01:26 crc kubenswrapper[4738]: echo "Failed to preserve hosts file. Exiting." Mar 07 07:01:26 crc kubenswrapper[4738]: exit 1 Mar 07 07:01:26 crc kubenswrapper[4738]: fi Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: while true; do Mar 07 07:01:26 crc kubenswrapper[4738]: declare -A svc_ips Mar 07 07:01:26 crc kubenswrapper[4738]: for svc in "${services[@]}"; do Mar 07 07:01:26 crc kubenswrapper[4738]: # Fetch service IP from cluster dns if present. We make several tries Mar 07 07:01:26 crc kubenswrapper[4738]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 07 07:01:26 crc kubenswrapper[4738]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 07 07:01:26 crc kubenswrapper[4738]: # support UDP loadbalancers and require reaching DNS through TCP. 
Mar 07 07:01:26 crc kubenswrapper[4738]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 07 07:01:26 crc kubenswrapper[4738]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 07 07:01:26 crc kubenswrapper[4738]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 07 07:01:26 crc kubenswrapper[4738]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 07 07:01:26 crc kubenswrapper[4738]: for i in ${!cmds[*]} Mar 07 07:01:26 crc kubenswrapper[4738]: do Mar 07 07:01:26 crc kubenswrapper[4738]: ips=($(eval "${cmds[i]}")) Mar 07 07:01:26 crc kubenswrapper[4738]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 07 07:01:26 crc kubenswrapper[4738]: svc_ips["${svc}"]="${ips[@]}" Mar 07 07:01:26 crc kubenswrapper[4738]: break Mar 07 07:01:26 crc kubenswrapper[4738]: fi Mar 07 07:01:26 crc kubenswrapper[4738]: done Mar 07 07:01:26 crc kubenswrapper[4738]: done Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: # Update /etc/hosts only if we get valid service IPs Mar 07 07:01:26 crc kubenswrapper[4738]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 07 07:01:26 crc kubenswrapper[4738]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 07 07:01:26 crc kubenswrapper[4738]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 07 07:01:26 crc kubenswrapper[4738]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 07 07:01:26 crc kubenswrapper[4738]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 07 07:01:26 crc kubenswrapper[4738]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 07 07:01:26 crc kubenswrapper[4738]: sleep 60 & wait Mar 07 07:01:26 crc kubenswrapper[4738]: continue Mar 07 07:01:26 crc kubenswrapper[4738]: fi Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: # Append resolver entries for services Mar 07 07:01:26 crc kubenswrapper[4738]: rc=0 Mar 07 07:01:26 crc kubenswrapper[4738]: for svc in "${!svc_ips[@]}"; do Mar 07 07:01:26 crc kubenswrapper[4738]: for ip in ${svc_ips[${svc}]}; do Mar 07 07:01:26 crc kubenswrapper[4738]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 07 07:01:26 crc kubenswrapper[4738]: done Mar 07 07:01:26 crc kubenswrapper[4738]: done Mar 07 07:01:26 crc kubenswrapper[4738]: if [[ $rc -ne 0 ]]; then Mar 07 07:01:26 crc kubenswrapper[4738]: sleep 60 & wait Mar 07 07:01:26 crc kubenswrapper[4738]: continue Mar 07 07:01:26 crc kubenswrapper[4738]: fi Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: Mar 07 07:01:26 crc kubenswrapper[4738]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 07 07:01:26 crc kubenswrapper[4738]: # Replace /etc/hosts with our modified version if needed Mar 07 07:01:26 crc kubenswrapper[4738]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 07 07:01:26 crc kubenswrapper[4738]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 07 07:01:26 crc kubenswrapper[4738]: fi Mar 07 07:01:26 crc kubenswrapper[4738]: sleep 60 & wait Mar 07 07:01:26 crc kubenswrapper[4738]: unset svc_ips Mar 07 07:01:26 crc kubenswrapper[4738]: done Mar 07 07:01:26 crc kubenswrapper[4738]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zllrn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-mbtvs_openshift-dns(ed4db713-ac09-4c8e-ab4c-f9031c78d476): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:26 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:26 crc kubenswrapper[4738]: E0307 07:01:26.823444 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-mbtvs" 
podUID="ed4db713-ac09-4c8e-ab4c-f9031c78d476" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.824652 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.838608 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c0a91659-d53f-4694-82a7-8c66445ab4f5-multus-daemon-config\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.838662 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.838692 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-run-multus-certs\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.838717 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn2s9\" (UniqueName: \"kubernetes.io/projected/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-kube-api-access-zn2s9\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.838746 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7-proxy-tls\") pod \"machine-config-daemon-t7vcc\" (UID: 
\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\") " pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.838770 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7-mcd-auth-proxy-config\") pod \"machine-config-daemon-t7vcc\" (UID: \"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\") " pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.838793 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-run-netns\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.838826 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-cnibin\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.838851 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-system-cni-dir\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.838877 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-cnibin\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " 
pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.838869 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-run-multus-certs\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.838900 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c0a91659-d53f-4694-82a7-8c66445ab4f5-cni-binary-copy\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839083 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-var-lib-kubelet\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839141 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-hostroot\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839232 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-etc-kubernetes\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839284 4738 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-622rm\" (UniqueName: \"kubernetes.io/projected/c0a91659-d53f-4694-82a7-8c66445ab4f5-kube-api-access-622rm\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839345 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-multus-socket-dir-parent\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839403 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-run-netns\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839476 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-multus-conf-dir\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839493 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-system-cni-dir\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839519 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-hostroot\") pod \"multus-54cnw\" (UID: 
\"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839574 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-multus-socket-dir-parent\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839597 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-cnibin\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839638 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-var-lib-kubelet\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839664 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-cnibin\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839423 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-multus-conf-dir\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839714 4738 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-etc-kubernetes\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839724 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-multus-cni-dir\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839773 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-multus-cni-dir\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839801 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-os-release\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839854 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839866 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-cni-binary-copy\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839897 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-os-release\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.839974 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwsww\" (UniqueName: \"kubernetes.io/projected/0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7-kube-api-access-fwsww\") pod \"machine-config-daemon-t7vcc\" (UID: \"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\") " pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.840012 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-os-release\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.840051 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7-rootfs\") pod \"machine-config-daemon-t7vcc\" (UID: \"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\") " pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.840088 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-var-lib-cni-bin\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.840126 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-system-cni-dir\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.840167 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-run-k8s-cni-cncf-io\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.840228 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-var-lib-cni-multus\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.840263 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.840286 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-var-lib-cni-bin\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.840356 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-run-k8s-cni-cncf-io\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.840356 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7-rootfs\") pod \"machine-config-daemon-t7vcc\" (UID: \"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\") " pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.840399 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c0a91659-d53f-4694-82a7-8c66445ab4f5-host-var-lib-cni-multus\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.840451 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-os-release\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.840474 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-system-cni-dir\") pod 
\"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.840468 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c0a91659-d53f-4694-82a7-8c66445ab4f5-cni-binary-copy\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.840931 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7-mcd-auth-proxy-config\") pod \"machine-config-daemon-t7vcc\" (UID: \"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\") " pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.841372 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-cni-binary-copy\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.841951 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c0a91659-d53f-4694-82a7-8c66445ab4f5-multus-daemon-config\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.842598 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.843424 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.851983 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7-proxy-tls\") pod \"machine-config-daemon-t7vcc\" (UID: \"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\") " pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.855550 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.864165 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwsww\" (UniqueName: \"kubernetes.io/projected/0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7-kube-api-access-fwsww\") pod \"machine-config-daemon-t7vcc\" (UID: \"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\") " pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.864821 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-622rm\" (UniqueName: \"kubernetes.io/projected/c0a91659-d53f-4694-82a7-8c66445ab4f5-kube-api-access-622rm\") pod \"multus-54cnw\" (UID: \"c0a91659-d53f-4694-82a7-8c66445ab4f5\") " pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.870647 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn2s9\" (UniqueName: \"kubernetes.io/projected/3c725cf7-39ed-4a27-abf6-8e8346cc6ebd-kube-api-access-zn2s9\") pod \"multus-additional-cni-plugins-fmp5z\" (UID: \"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\") " pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.876036 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.890941 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.891139 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.891324 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.891468 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.891593 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:26Z","lastTransitionTime":"2026-03-07T07:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.889938 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.904665 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.917883 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.929062 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.944458 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.956751 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.959552 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.969008 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-54cnw" Mar 07 07:01:26 crc kubenswrapper[4738]: E0307 07:01:26.971348 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fwsww,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 07 07:01:26 crc kubenswrapper[4738]: E0307 07:01:26.974949 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt 
--tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fwsww,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 07 07:01:26 crc kubenswrapper[4738]: E0307 07:01:26.976113 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" 
podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.976382 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.977249 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: E0307 07:01:26.986974 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:26 crc kubenswrapper[4738]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 07 07:01:26 crc kubenswrapper[4738]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 07 07:01:26 crc kubenswrapper[4738]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-622rm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-54cnw_openshift-multus(c0a91659-d53f-4694-82a7-8c66445ab4f5): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:26 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:26 crc kubenswrapper[4738]: E0307 07:01:26.988306 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-54cnw" podUID="c0a91659-d53f-4694-82a7-8c66445ab4f5" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.992445 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.994411 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.994447 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.994460 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.994506 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:26 crc kubenswrapper[4738]: I0307 07:01:26.994523 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:26Z","lastTransitionTime":"2026-03-07T07:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:26 crc kubenswrapper[4738]: W0307 07:01:26.996428 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c725cf7_39ed_4a27_abf6_8e8346cc6ebd.slice/crio-226c698451302de2a8202ad92410eb8d869406141f41c3af67878bbc9a3bbc55 WatchSource:0}: Error finding container 226c698451302de2a8202ad92410eb8d869406141f41c3af67878bbc9a3bbc55: Status 404 returned error can't find the container with id 226c698451302de2a8202ad92410eb8d869406141f41c3af67878bbc9a3bbc55 Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:26.999333 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zn2s9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-fmp5z_openshift-multus(3c725cf7-39ed-4a27-abf6-8e8346cc6ebd): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:27.008364 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" podUID="3c725cf7-39ed-4a27-abf6-8e8346cc6ebd" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.013764 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jh7s7"] Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.013908 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.015600 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.019980 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.020945 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.022043 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.022090 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.022587 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.022845 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.024814 4738 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.029824 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.041250 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.051130 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.065052 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.083744 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-me
trics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"n
ame\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"
etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":
\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.094313 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.097299 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.097325 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.097334 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 
07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.097349 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.097361 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:27Z","lastTransitionTime":"2026-03-07T07:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.104362 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.124932 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static
-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z
\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.138372 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.143739 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-systemd\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.143870 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-cni-bin\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.143975 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v46jv\" (UniqueName: \"kubernetes.io/projected/0e3f9734-9fb5-4b90-9268-888bc377406e-kube-api-access-v46jv\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.144100 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-ovn\") pod \"ovnkube-node-jh7s7\" (UID: 
\"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.144236 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0e3f9734-9fb5-4b90-9268-888bc377406e-ovn-node-metrics-cert\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.144368 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-run-ovn-kubernetes\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.144431 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-slash\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.144459 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-log-socket\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.144489 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-run-netns\") pod 
\"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.144514 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-cni-netd\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.144618 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-env-overrides\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.144738 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.144778 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-etc-openvswitch\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.144868 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-kubelet\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.144913 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-ovnkube-script-lib\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.144962 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-var-lib-openvswitch\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.145004 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-node-log\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.145036 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-ovnkube-config\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.145086 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-systemd-units\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.145120 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-openvswitch\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.151151 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.167665 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.178770 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.197707 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.200818 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.200859 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.200870 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.200895 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.200910 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:27Z","lastTransitionTime":"2026-03-07T07:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.246607 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-slash\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.246685 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-log-socket\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.246745 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-run-netns\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.246787 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-cni-netd\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.246783 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-slash\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.246827 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-env-overrides\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.246916 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.246959 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-etc-openvswitch\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.247030 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-kubelet\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.247064 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-ovnkube-script-lib\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.247108 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-var-lib-openvswitch\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.247142 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-ovnkube-config\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.247217 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-node-log\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.247252 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-systemd-units\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.247285 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-openvswitch\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.247332 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-systemd\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.247362 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-cni-bin\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.247394 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v46jv\" (UniqueName: \"kubernetes.io/projected/0e3f9734-9fb5-4b90-9268-888bc377406e-kube-api-access-v46jv\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.247426 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0e3f9734-9fb5-4b90-9268-888bc377406e-ovn-node-metrics-cert\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.247458 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-ovn\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.247493 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-run-ovn-kubernetes\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.247605 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-run-ovn-kubernetes\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.247653 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-var-lib-openvswitch\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.248056 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-env-overrides\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.248209 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-log-socket\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.248234 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-ovnkube-script-lib\") pod 
\"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.248285 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-run-netns\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.248290 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-systemd-units\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.248327 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-openvswitch\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.248357 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-etc-openvswitch\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.248331 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jh7s7\" (UID: 
\"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.248392 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-kubelet\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.248388 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-cni-netd\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.248451 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-ovn\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.248251 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-node-log\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.248513 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-systemd\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.248578 
4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-cni-bin\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.249845 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-ovnkube-config\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.251382 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0e3f9734-9fb5-4b90-9268-888bc377406e-ovn-node-metrics-cert\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.267488 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v46jv\" (UniqueName: \"kubernetes.io/projected/0e3f9734-9fb5-4b90-9268-888bc377406e-kube-api-access-v46jv\") pod \"ovnkube-node-jh7s7\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.303764 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.303810 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.303823 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:27 crc 
kubenswrapper[4738]: I0307 07:01:27.303843 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.303858 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:27Z","lastTransitionTime":"2026-03-07T07:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.338864 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:27 crc kubenswrapper[4738]: W0307 07:01:27.354988 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e3f9734_9fb5_4b90_9268_888bc377406e.slice/crio-07e7e0f5700de1ac15b11facbe5e897760face580efdb4957e9a0b59babde84d WatchSource:0}: Error finding container 07e7e0f5700de1ac15b11facbe5e897760face580efdb4957e9a0b59babde84d: Status 404 returned error can't find the container with id 07e7e0f5700de1ac15b11facbe5e897760face580efdb4957e9a0b59babde84d Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:27.358240 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:27 crc kubenswrapper[4738]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 07 07:01:27 crc kubenswrapper[4738]: apiVersion: v1 Mar 07 07:01:27 crc kubenswrapper[4738]: clusters: Mar 07 07:01:27 crc kubenswrapper[4738]: - cluster: Mar 07 07:01:27 crc kubenswrapper[4738]: certificate-authority: 
/var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 07 07:01:27 crc kubenswrapper[4738]: server: https://api-int.crc.testing:6443 Mar 07 07:01:27 crc kubenswrapper[4738]: name: default-cluster Mar 07 07:01:27 crc kubenswrapper[4738]: contexts: Mar 07 07:01:27 crc kubenswrapper[4738]: - context: Mar 07 07:01:27 crc kubenswrapper[4738]: cluster: default-cluster Mar 07 07:01:27 crc kubenswrapper[4738]: namespace: default Mar 07 07:01:27 crc kubenswrapper[4738]: user: default-auth Mar 07 07:01:27 crc kubenswrapper[4738]: name: default-context Mar 07 07:01:27 crc kubenswrapper[4738]: current-context: default-context Mar 07 07:01:27 crc kubenswrapper[4738]: kind: Config Mar 07 07:01:27 crc kubenswrapper[4738]: preferences: {} Mar 07 07:01:27 crc kubenswrapper[4738]: users: Mar 07 07:01:27 crc kubenswrapper[4738]: - name: default-auth Mar 07 07:01:27 crc kubenswrapper[4738]: user: Mar 07 07:01:27 crc kubenswrapper[4738]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 07 07:01:27 crc kubenswrapper[4738]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 07 07:01:27 crc kubenswrapper[4738]: EOF Mar 07 07:01:27 crc kubenswrapper[4738]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v46jv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:27 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:27.359560 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.385550 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.385704 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:27.385783 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.385550 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:27.385991 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:27.385889 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.406003 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.406045 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.406056 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.406076 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.406089 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:27Z","lastTransitionTime":"2026-03-07T07:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.509052 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.509265 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.509279 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.509296 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.509309 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:27Z","lastTransitionTime":"2026-03-07T07:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.611909 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.611980 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.611998 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.612029 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.612049 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:27Z","lastTransitionTime":"2026-03-07T07:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.714834 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.714879 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.714890 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.714906 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.714931 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:27Z","lastTransitionTime":"2026-03-07T07:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.817876 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.817934 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.817948 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.817975 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.817990 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:27Z","lastTransitionTime":"2026-03-07T07:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.825002 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-54cnw" event={"ID":"c0a91659-d53f-4694-82a7-8c66445ab4f5","Type":"ContainerStarted","Data":"5f78dd3c2da1b30e0c040a5edd57700d612d1703073b8ebeb323b2317294c4c3"} Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.826252 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerStarted","Data":"00f92c933a88ba2c9f790205f922b734b136b033234cff1b92e667b1b654d3c0"} Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:27.826695 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:27 crc kubenswrapper[4738]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 07 07:01:27 crc kubenswrapper[4738]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 07 07:01:27 crc kubenswrapper[4738]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-622rm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-54cnw_openshift-multus(c0a91659-d53f-4694-82a7-8c66445ab4f5): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:27 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.827765 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerStarted","Data":"07e7e0f5700de1ac15b11facbe5e897760face580efdb4957e9a0b59babde84d"} Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:27.827815 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fwsww,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:27.827903 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-54cnw" podUID="c0a91659-d53f-4694-82a7-8c66445ab4f5" Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:27.829037 4738 kuberuntime_manager.go:1274] 
"Unhandled Error" err=< Mar 07 07:01:27 crc kubenswrapper[4738]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 07 07:01:27 crc kubenswrapper[4738]: apiVersion: v1 Mar 07 07:01:27 crc kubenswrapper[4738]: clusters: Mar 07 07:01:27 crc kubenswrapper[4738]: - cluster: Mar 07 07:01:27 crc kubenswrapper[4738]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 07 07:01:27 crc kubenswrapper[4738]: server: https://api-int.crc.testing:6443 Mar 07 07:01:27 crc kubenswrapper[4738]: name: default-cluster Mar 07 07:01:27 crc kubenswrapper[4738]: contexts: Mar 07 07:01:27 crc kubenswrapper[4738]: - context: Mar 07 07:01:27 crc kubenswrapper[4738]: cluster: default-cluster Mar 07 07:01:27 crc kubenswrapper[4738]: namespace: default Mar 07 07:01:27 crc kubenswrapper[4738]: user: default-auth Mar 07 07:01:27 crc kubenswrapper[4738]: name: default-context Mar 07 07:01:27 crc kubenswrapper[4738]: current-context: default-context Mar 07 07:01:27 crc kubenswrapper[4738]: kind: Config Mar 07 07:01:27 crc kubenswrapper[4738]: preferences: {} Mar 07 07:01:27 crc kubenswrapper[4738]: users: Mar 07 07:01:27 crc kubenswrapper[4738]: - name: default-auth Mar 07 07:01:27 crc kubenswrapper[4738]: user: Mar 07 07:01:27 crc kubenswrapper[4738]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 07 07:01:27 crc kubenswrapper[4738]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 07 07:01:27 crc kubenswrapper[4738]: EOF Mar 07 07:01:27 crc kubenswrapper[4738]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v46jv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:27 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.829129 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" event={"ID":"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd","Type":"ContainerStarted","Data":"226c698451302de2a8202ad92410eb8d869406141f41c3af67878bbc9a3bbc55"} Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:27.830233 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:27.830518 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fwsww,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:27.831180 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zn2s9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed 
in pod multus-additional-cni-plugins-fmp5z_openshift-multus(3c725cf7-39ed-4a27-abf6-8e8346cc6ebd): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:27.832127 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:01:27 crc kubenswrapper[4738]: E0307 07:01:27.832241 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" podUID="3c725cf7-39ed-4a27-abf6-8e8346cc6ebd" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.842744 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.857638 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.876320 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a
67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.888330 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.907312 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.922032 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.922102 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.922114 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.922136 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.922151 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:27Z","lastTransitionTime":"2026-03-07T07:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.923041 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.956568 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.968057 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.980319 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:27 crc kubenswrapper[4738]: I0307 07:01:27.990677 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.019777 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.036299 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.036341 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.036358 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.036378 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.036390 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:28Z","lastTransitionTime":"2026-03-07T07:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.067883 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.079004 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.092983 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.100250 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.116650 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.126970 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.138401 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.139853 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.139923 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.139938 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.139962 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.139978 4738 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:28Z","lastTransitionTime":"2026-03-07T07:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.149110 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.159991 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.173092 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.185118 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.195401 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.203397 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.211914 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.230812 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.242850 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.242901 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.242910 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.242929 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.242942 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:28Z","lastTransitionTime":"2026-03-07T07:01:28Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.346036 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.346083 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.346092 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.346108 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.346117 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:28Z","lastTransitionTime":"2026-03-07T07:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.449317 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.449378 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.449390 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.449411 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.449425 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:28Z","lastTransitionTime":"2026-03-07T07:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.552536 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.552592 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.552639 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.552673 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.552689 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:28Z","lastTransitionTime":"2026-03-07T07:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.655530 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.655607 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.655625 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.655653 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.655666 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:28Z","lastTransitionTime":"2026-03-07T07:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.758256 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.758338 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.758357 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.758384 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.758434 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:28Z","lastTransitionTime":"2026-03-07T07:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.862353 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.862450 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.862471 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.862498 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.862517 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:28Z","lastTransitionTime":"2026-03-07T07:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.965713 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.965785 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.965804 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.965833 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:28 crc kubenswrapper[4738]: I0307 07:01:28.965852 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:28Z","lastTransitionTime":"2026-03-07T07:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.069861 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.069928 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.069946 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.069972 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.069990 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:29Z","lastTransitionTime":"2026-03-07T07:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.177659 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.177732 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.177755 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.177786 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.177810 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:29Z","lastTransitionTime":"2026-03-07T07:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.280196 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.280287 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.280315 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.280366 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.280395 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:29Z","lastTransitionTime":"2026-03-07T07:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.383080 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.383146 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.383164 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.383207 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.383222 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:29Z","lastTransitionTime":"2026-03-07T07:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.385379 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.385418 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:29 crc kubenswrapper[4738]: E0307 07:01:29.385599 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:29 crc kubenswrapper[4738]: E0307 07:01:29.385697 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.385419 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:29 crc kubenswrapper[4738]: E0307 07:01:29.385815 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.486481 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.486556 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.486580 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.486610 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.486637 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:29Z","lastTransitionTime":"2026-03-07T07:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.590287 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.590387 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.590408 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.590439 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.590458 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:29Z","lastTransitionTime":"2026-03-07T07:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.693127 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.693207 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.693218 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.693240 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.693252 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:29Z","lastTransitionTime":"2026-03-07T07:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.774698 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.774761 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.774778 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.774799 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.774811 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:29Z","lastTransitionTime":"2026-03-07T07:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:29 crc kubenswrapper[4738]: E0307 07:01:29.785848 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.790147 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.790231 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.790244 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.790265 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.790277 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:29Z","lastTransitionTime":"2026-03-07T07:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:29 crc kubenswrapper[4738]: E0307 07:01:29.801763 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.806195 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.806237 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.806246 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.806263 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.806276 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:29Z","lastTransitionTime":"2026-03-07T07:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:29 crc kubenswrapper[4738]: E0307 07:01:29.817643 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.821838 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.821918 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.821939 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.821968 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.821985 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:29Z","lastTransitionTime":"2026-03-07T07:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:29 crc kubenswrapper[4738]: E0307 07:01:29.837234 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.842484 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.842525 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.842537 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.842558 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.842573 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:29Z","lastTransitionTime":"2026-03-07T07:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:29 crc kubenswrapper[4738]: E0307 07:01:29.853620 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:29 crc kubenswrapper[4738]: E0307 07:01:29.853774 4738 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.855474 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.855526 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.855538 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.855552 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.855563 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:29Z","lastTransitionTime":"2026-03-07T07:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.960030 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.960131 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.960203 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.960255 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:29 crc kubenswrapper[4738]: I0307 07:01:29.960317 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:29Z","lastTransitionTime":"2026-03-07T07:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.063473 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.063552 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.063566 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.063585 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.063621 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:30Z","lastTransitionTime":"2026-03-07T07:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.167138 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.167236 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.167256 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.167281 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.167296 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:30Z","lastTransitionTime":"2026-03-07T07:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.269753 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.269797 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.269806 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.269822 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.269831 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:30Z","lastTransitionTime":"2026-03-07T07:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.373043 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.373124 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.373143 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.373199 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.373224 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:30Z","lastTransitionTime":"2026-03-07T07:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:30 crc kubenswrapper[4738]: E0307 07:01:30.387918 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Re
startPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 07 07:01:30 crc kubenswrapper[4738]: E0307 07:01:30.389193 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.484034 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.484113 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.484129 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.484605 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.484651 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:30Z","lastTransitionTime":"2026-03-07T07:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.587937 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.588034 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.588046 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.588065 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.588323 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:30Z","lastTransitionTime":"2026-03-07T07:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.691713 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.691762 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.691772 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.691792 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.691804 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:30Z","lastTransitionTime":"2026-03-07T07:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.794822 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.794876 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.794894 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.794915 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.794929 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:30Z","lastTransitionTime":"2026-03-07T07:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.898372 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.898431 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.898444 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.898466 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:30 crc kubenswrapper[4738]: I0307 07:01:30.898481 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:30Z","lastTransitionTime":"2026-03-07T07:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.001132 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.001202 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.001214 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.001229 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.001239 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:31Z","lastTransitionTime":"2026-03-07T07:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.104781 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.104847 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.104872 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.104905 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.104932 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:31Z","lastTransitionTime":"2026-03-07T07:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.208245 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.208534 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.208585 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.208631 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.208656 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:31Z","lastTransitionTime":"2026-03-07T07:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.311661 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.312028 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.312238 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.312456 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.312686 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:31Z","lastTransitionTime":"2026-03-07T07:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.385443 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.385579 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:31 crc kubenswrapper[4738]: E0307 07:01:31.385648 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.385721 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:31 crc kubenswrapper[4738]: E0307 07:01:31.385746 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:31 crc kubenswrapper[4738]: E0307 07:01:31.385981 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:31 crc kubenswrapper[4738]: E0307 07:01:31.387992 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:31 crc kubenswrapper[4738]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 07 07:01:31 crc kubenswrapper[4738]: set -o allexport Mar 07 07:01:31 crc kubenswrapper[4738]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 07 07:01:31 crc kubenswrapper[4738]: source /etc/kubernetes/apiserver-url.env Mar 07 07:01:31 crc kubenswrapper[4738]: else Mar 07 07:01:31 crc kubenswrapper[4738]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 07 07:01:31 crc kubenswrapper[4738]: exit 1 Mar 07 07:01:31 crc kubenswrapper[4738]: fi Mar 07 07:01:31 crc kubenswrapper[4738]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 07 07:01:31 crc kubenswrapper[4738]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:31 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:31 crc kubenswrapper[4738]: E0307 07:01:31.389175 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.415508 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 
07:01:31.415573 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.415592 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.415614 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.415630 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:31Z","lastTransitionTime":"2026-03-07T07:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.518340 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.518398 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.518413 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.518432 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.518445 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:31Z","lastTransitionTime":"2026-03-07T07:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.621764 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.622079 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.622221 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.622311 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.622372 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:31Z","lastTransitionTime":"2026-03-07T07:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.724837 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.724894 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.724908 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.724924 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.724934 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:31Z","lastTransitionTime":"2026-03-07T07:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.828839 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.828887 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.828899 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.828928 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.828940 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:31Z","lastTransitionTime":"2026-03-07T07:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.932348 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.932408 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.932427 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.932455 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:31 crc kubenswrapper[4738]: I0307 07:01:31.932475 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:31Z","lastTransitionTime":"2026-03-07T07:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.036212 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.036264 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.036274 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.036300 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.036311 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:32Z","lastTransitionTime":"2026-03-07T07:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.140331 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.140392 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.140406 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.140429 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.140443 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:32Z","lastTransitionTime":"2026-03-07T07:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.244006 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.244064 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.244075 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.244095 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.244109 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:32Z","lastTransitionTime":"2026-03-07T07:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.347121 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.347223 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.347243 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.347270 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.347290 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:32Z","lastTransitionTime":"2026-03-07T07:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:32 crc kubenswrapper[4738]: E0307 07:01:32.387612 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:32 crc kubenswrapper[4738]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 07 07:01:32 crc kubenswrapper[4738]: if [[ -f "/env/_master" ]]; then Mar 07 07:01:32 crc kubenswrapper[4738]: set -o allexport Mar 07 07:01:32 crc kubenswrapper[4738]: source "/env/_master" Mar 07 07:01:32 crc kubenswrapper[4738]: set +o allexport Mar 07 07:01:32 crc kubenswrapper[4738]: fi Mar 07 07:01:32 crc kubenswrapper[4738]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 07 07:01:32 crc kubenswrapper[4738]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 07 07:01:32 crc kubenswrapper[4738]: ho_enable="--enable-hybrid-overlay" Mar 07 07:01:32 crc kubenswrapper[4738]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 07 07:01:32 crc kubenswrapper[4738]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 07 07:01:32 crc kubenswrapper[4738]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 07 07:01:32 crc kubenswrapper[4738]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 07 07:01:32 crc kubenswrapper[4738]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 07 07:01:32 crc kubenswrapper[4738]: --webhook-host=127.0.0.1 \ Mar 07 07:01:32 crc kubenswrapper[4738]: --webhook-port=9743 \ Mar 07 07:01:32 crc kubenswrapper[4738]: ${ho_enable} \ Mar 07 07:01:32 crc kubenswrapper[4738]: --enable-interconnect \ Mar 07 07:01:32 crc kubenswrapper[4738]: --disable-approver \ Mar 07 07:01:32 crc kubenswrapper[4738]: 
--extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 07 07:01:32 crc kubenswrapper[4738]: --wait-for-kubernetes-api=200s \ Mar 07 07:01:32 crc kubenswrapper[4738]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 07 07:01:32 crc kubenswrapper[4738]: --loglevel="${LOGLEVEL}" Mar 07 07:01:32 crc kubenswrapper[4738]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false
,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:32 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:32 crc kubenswrapper[4738]: E0307 07:01:32.391503 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:32 crc kubenswrapper[4738]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 07 07:01:32 crc kubenswrapper[4738]: if [[ -f "/env/_master" ]]; then Mar 07 07:01:32 crc kubenswrapper[4738]: set -o allexport Mar 07 07:01:32 crc kubenswrapper[4738]: source "/env/_master" Mar 07 07:01:32 crc kubenswrapper[4738]: set +o allexport Mar 07 07:01:32 crc kubenswrapper[4738]: fi Mar 07 07:01:32 crc kubenswrapper[4738]: Mar 07 07:01:32 crc kubenswrapper[4738]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 07 07:01:32 crc kubenswrapper[4738]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 07 07:01:32 crc kubenswrapper[4738]: --disable-webhook \ Mar 07 07:01:32 crc kubenswrapper[4738]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 07 07:01:32 crc kubenswrapper[4738]: --loglevel="${LOGLEVEL}" Mar 07 07:01:32 crc kubenswrapper[4738]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:32 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:32 crc kubenswrapper[4738]: E0307 07:01:32.392713 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.400229 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.412826 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.428098 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.441473 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.450034 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:32 crc 
kubenswrapper[4738]: I0307 07:01:32.450477 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.450591 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.450764 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.450872 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:32Z","lastTransitionTime":"2026-03-07T07:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.459024 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.482703 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.499883 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.514903 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.529249 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.550554 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.554600 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.554654 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.554671 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.554695 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.554711 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:32Z","lastTransitionTime":"2026-03-07T07:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.564245 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.578351 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.587480 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.657624 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.657725 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.657746 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.657773 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.657792 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:32Z","lastTransitionTime":"2026-03-07T07:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.760699 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.760743 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.760755 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.760775 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.760786 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:32Z","lastTransitionTime":"2026-03-07T07:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.809398 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-6tq9s"] Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.809834 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6tq9s" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.812402 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.812723 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.812932 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.814482 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.829812 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.838252 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.852630 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.863245 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.863320 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.863341 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.863371 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.863390 4738 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:32Z","lastTransitionTime":"2026-03-07T07:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.866032 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.875943 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.892633 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.902497 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.911286 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rb7w\" (UniqueName: \"kubernetes.io/projected/d81bc0bc-b25f-4d1d-b384-2e220823aa3a-kube-api-access-6rb7w\") pod \"node-ca-6tq9s\" (UID: \"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\") " pod="openshift-image-registry/node-ca-6tq9s" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.911342 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/d81bc0bc-b25f-4d1d-b384-2e220823aa3a-serviceca\") pod \"node-ca-6tq9s\" (UID: \"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\") " pod="openshift-image-registry/node-ca-6tq9s" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.911367 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d81bc0bc-b25f-4d1d-b384-2e220823aa3a-host\") pod \"node-ca-6tq9s\" (UID: \"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\") " pod="openshift-image-registry/node-ca-6tq9s" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.930850 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"
startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.947688 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.960262 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.966743 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.966814 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.966832 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.966858 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.966880 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:32Z","lastTransitionTime":"2026-03-07T07:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.973672 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:32 crc kubenswrapper[4738]: I0307 07:01:32.994377 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.005860 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.011981 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d81bc0bc-b25f-4d1d-b384-2e220823aa3a-host\") pod \"node-ca-6tq9s\" (UID: \"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\") " pod="openshift-image-registry/node-ca-6tq9s" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.012101 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rb7w\" (UniqueName: \"kubernetes.io/projected/d81bc0bc-b25f-4d1d-b384-2e220823aa3a-kube-api-access-6rb7w\") pod \"node-ca-6tq9s\" (UID: \"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\") " pod="openshift-image-registry/node-ca-6tq9s" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.012193 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d81bc0bc-b25f-4d1d-b384-2e220823aa3a-serviceca\") pod \"node-ca-6tq9s\" (UID: \"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\") " pod="openshift-image-registry/node-ca-6tq9s" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 
07:01:33.012291 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d81bc0bc-b25f-4d1d-b384-2e220823aa3a-host\") pod \"node-ca-6tq9s\" (UID: \"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\") " pod="openshift-image-registry/node-ca-6tq9s" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.014260 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d81bc0bc-b25f-4d1d-b384-2e220823aa3a-serviceca\") pod \"node-ca-6tq9s\" (UID: \"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\") " pod="openshift-image-registry/node-ca-6tq9s" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.017345 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.032239 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rb7w\" (UniqueName: \"kubernetes.io/projected/d81bc0bc-b25f-4d1d-b384-2e220823aa3a-kube-api-access-6rb7w\") pod \"node-ca-6tq9s\" (UID: \"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\") " pod="openshift-image-registry/node-ca-6tq9s" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.070633 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.070717 4738 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.070735 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.070760 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.070774 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:33Z","lastTransitionTime":"2026-03-07T07:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.122732 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-6tq9s" Mar 07 07:01:33 crc kubenswrapper[4738]: W0307 07:01:33.142654 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd81bc0bc_b25f_4d1d_b384_2e220823aa3a.slice/crio-85f59be31a0a8c10497203165abc49c3b8edeb865c52503f33efaf9eb2339129 WatchSource:0}: Error finding container 85f59be31a0a8c10497203165abc49c3b8edeb865c52503f33efaf9eb2339129: Status 404 returned error can't find the container with id 85f59be31a0a8c10497203165abc49c3b8edeb865c52503f33efaf9eb2339129 Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.151232 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:33 crc kubenswrapper[4738]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 07 07:01:33 crc kubenswrapper[4738]: while [ true ]; Mar 07 07:01:33 crc kubenswrapper[4738]: do Mar 07 07:01:33 crc kubenswrapper[4738]: for f in $(ls /tmp/serviceca); do Mar 07 07:01:33 crc kubenswrapper[4738]: echo $f Mar 07 07:01:33 crc kubenswrapper[4738]: ca_file_path="/tmp/serviceca/${f}" Mar 07 07:01:33 crc kubenswrapper[4738]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 07 07:01:33 crc kubenswrapper[4738]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 07 07:01:33 crc kubenswrapper[4738]: if [ -e "${reg_dir_path}" ]; then Mar 07 07:01:33 crc kubenswrapper[4738]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 07 07:01:33 crc kubenswrapper[4738]: else Mar 07 07:01:33 crc kubenswrapper[4738]: mkdir $reg_dir_path Mar 07 07:01:33 crc kubenswrapper[4738]: cp $ca_file_path $reg_dir_path/ca.crt Mar 07 07:01:33 crc kubenswrapper[4738]: fi Mar 07 07:01:33 crc kubenswrapper[4738]: done Mar 07 07:01:33 crc kubenswrapper[4738]: for d in $(ls /etc/docker/certs.d); do 
Mar 07 07:01:33 crc kubenswrapper[4738]: echo $d Mar 07 07:01:33 crc kubenswrapper[4738]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 07 07:01:33 crc kubenswrapper[4738]: reg_conf_path="/tmp/serviceca/${dp}" Mar 07 07:01:33 crc kubenswrapper[4738]: if [ ! -e "${reg_conf_path}" ]; then Mar 07 07:01:33 crc kubenswrapper[4738]: rm -rf /etc/docker/certs.d/$d Mar 07 07:01:33 crc kubenswrapper[4738]: fi Mar 07 07:01:33 crc kubenswrapper[4738]: done Mar 07 07:01:33 crc kubenswrapper[4738]: sleep 60 & wait ${!} Mar 07 07:01:33 crc kubenswrapper[4738]: done Mar 07 07:01:33 crc kubenswrapper[4738]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6rb7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
node-ca-6tq9s_openshift-image-registry(d81bc0bc-b25f-4d1d-b384-2e220823aa3a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:33 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.152491 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-6tq9s" podUID="d81bc0bc-b25f-4d1d-b384-2e220823aa3a" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.174143 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.174651 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.174679 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.174717 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.174742 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:33Z","lastTransitionTime":"2026-03-07T07:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.279112 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.279210 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.279228 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.279254 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.279274 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:33Z","lastTransitionTime":"2026-03-07T07:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.383107 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.383206 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.383227 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.383251 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.383267 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:33Z","lastTransitionTime":"2026-03-07T07:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.385612 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.385708 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.385773 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.385606 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.385947 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.386051 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.417012 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.417208 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.417378 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:01:49.417328521 +0000 UTC m=+127.882315882 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.417411 4738 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.417523 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:49.417496226 +0000 UTC m=+127.882483587 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.417581 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.417774 4738 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.417862 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:49.417847147 +0000 UTC m=+127.882834508 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.487257 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.487336 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.487357 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.487392 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.487492 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:33Z","lastTransitionTime":"2026-03-07T07:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.518794 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.518879 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.519044 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.519066 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.519080 4738 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.519158 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:49.519135092 +0000 UTC m=+127.984122423 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.519209 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.519327 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.519348 4738 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.519456 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 07:01:49.519421561 +0000 UTC m=+127.984408892 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.589696 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.589744 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.589758 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.589778 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.589793 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:33Z","lastTransitionTime":"2026-03-07T07:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.692896 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.692993 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.693025 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.693059 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.693083 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:33Z","lastTransitionTime":"2026-03-07T07:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.796143 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.796205 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.796214 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.796229 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.796240 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:33Z","lastTransitionTime":"2026-03-07T07:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.850467 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6tq9s" event={"ID":"d81bc0bc-b25f-4d1d-b384-2e220823aa3a","Type":"ContainerStarted","Data":"85f59be31a0a8c10497203165abc49c3b8edeb865c52503f33efaf9eb2339129"} Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.852901 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:33 crc kubenswrapper[4738]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 07 07:01:33 crc kubenswrapper[4738]: while [ true ]; Mar 07 07:01:33 crc kubenswrapper[4738]: do Mar 07 07:01:33 crc kubenswrapper[4738]: for f in $(ls /tmp/serviceca); do Mar 07 07:01:33 crc kubenswrapper[4738]: echo $f Mar 07 07:01:33 crc kubenswrapper[4738]: ca_file_path="/tmp/serviceca/${f}" Mar 07 07:01:33 crc kubenswrapper[4738]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 07 07:01:33 crc kubenswrapper[4738]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 07 07:01:33 crc kubenswrapper[4738]: if [ -e "${reg_dir_path}" ]; then Mar 07 07:01:33 crc kubenswrapper[4738]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 07 07:01:33 crc kubenswrapper[4738]: else Mar 07 07:01:33 crc kubenswrapper[4738]: mkdir $reg_dir_path Mar 07 07:01:33 crc kubenswrapper[4738]: cp $ca_file_path $reg_dir_path/ca.crt Mar 07 07:01:33 crc kubenswrapper[4738]: fi Mar 07 07:01:33 crc kubenswrapper[4738]: done Mar 07 07:01:33 crc kubenswrapper[4738]: for d in $(ls /etc/docker/certs.d); do Mar 07 07:01:33 crc kubenswrapper[4738]: echo $d Mar 07 07:01:33 crc kubenswrapper[4738]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 07 07:01:33 crc kubenswrapper[4738]: reg_conf_path="/tmp/serviceca/${dp}" Mar 07 07:01:33 crc kubenswrapper[4738]: if [ ! 
-e "${reg_conf_path}" ]; then Mar 07 07:01:33 crc kubenswrapper[4738]: rm -rf /etc/docker/certs.d/$d Mar 07 07:01:33 crc kubenswrapper[4738]: fi Mar 07 07:01:33 crc kubenswrapper[4738]: done Mar 07 07:01:33 crc kubenswrapper[4738]: sleep 60 & wait ${!} Mar 07 07:01:33 crc kubenswrapper[4738]: done Mar 07 07:01:33 crc kubenswrapper[4738]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6rb7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-6tq9s_openshift-image-registry(d81bc0bc-b25f-4d1d-b384-2e220823aa3a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:33 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:33 crc kubenswrapper[4738]: E0307 07:01:33.854117 4738 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-6tq9s" podUID="d81bc0bc-b25f-4d1d-b384-2e220823aa3a" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.869341 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.886092 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.897821 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.902319 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.902426 4738 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.902446 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.902476 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.902529 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:33Z","lastTransitionTime":"2026-03-07T07:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.916969 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.932570 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.944435 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.959118 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.974540 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:33 crc kubenswrapper[4738]: I0307 07:01:33.994833 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.006367 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.006436 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.006459 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.006489 4738 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.006512 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:34Z","lastTransitionTime":"2026-03-07T07:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.012905 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.025490 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.039087 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.054862 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.080284 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.109235 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.109296 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.109323 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.109347 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.109366 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:34Z","lastTransitionTime":"2026-03-07T07:01:34Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.212892 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.212958 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.212976 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.213004 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.213023 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:34Z","lastTransitionTime":"2026-03-07T07:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.316417 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.316468 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.316485 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.316514 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.316532 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:34Z","lastTransitionTime":"2026-03-07T07:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.420193 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.420539 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.420627 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.420715 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.420785 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:34Z","lastTransitionTime":"2026-03-07T07:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.523279 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.523326 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.523342 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.523362 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.523375 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:34Z","lastTransitionTime":"2026-03-07T07:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.627599 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.627640 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.627649 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.627663 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.627672 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:34Z","lastTransitionTime":"2026-03-07T07:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.730206 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.730313 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.730345 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.730379 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.730408 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:34Z","lastTransitionTime":"2026-03-07T07:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.832726 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.832764 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.832776 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.832794 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.832808 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:34Z","lastTransitionTime":"2026-03-07T07:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.936874 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.936911 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.936930 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.936950 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:34 crc kubenswrapper[4738]: I0307 07:01:34.936963 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:34Z","lastTransitionTime":"2026-03-07T07:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.040210 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.040277 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.040288 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.040307 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.040320 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:35Z","lastTransitionTime":"2026-03-07T07:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.143917 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.143981 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.143997 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.144023 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.144040 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:35Z","lastTransitionTime":"2026-03-07T07:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.247467 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.247540 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.247557 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.247585 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.247606 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:35Z","lastTransitionTime":"2026-03-07T07:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.350900 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.350966 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.350983 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.351010 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.351033 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:35Z","lastTransitionTime":"2026-03-07T07:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.385706 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.385758 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.385708 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:35 crc kubenswrapper[4738]: E0307 07:01:35.385915 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:35 crc kubenswrapper[4738]: E0307 07:01:35.386057 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:35 crc kubenswrapper[4738]: E0307 07:01:35.386146 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.453558 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.453609 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.453619 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.453635 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.453644 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:35Z","lastTransitionTime":"2026-03-07T07:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.556698 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.556811 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.556839 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.556870 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.556896 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:35Z","lastTransitionTime":"2026-03-07T07:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.660321 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.660407 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.660434 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.660465 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.660488 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:35Z","lastTransitionTime":"2026-03-07T07:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.763845 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.763909 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.763925 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.763949 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.763966 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:35Z","lastTransitionTime":"2026-03-07T07:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.866895 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.866974 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.866996 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.867025 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.867046 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:35Z","lastTransitionTime":"2026-03-07T07:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.970307 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.970424 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.970450 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.970483 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:35 crc kubenswrapper[4738]: I0307 07:01:35.970503 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:35Z","lastTransitionTime":"2026-03-07T07:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.074136 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.074225 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.074238 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.074255 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.074266 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:36Z","lastTransitionTime":"2026-03-07T07:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.177812 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.177870 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.177882 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.177904 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.177923 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:36Z","lastTransitionTime":"2026-03-07T07:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.281427 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.281499 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.281512 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.281534 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.281550 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:36Z","lastTransitionTime":"2026-03-07T07:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.385554 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.385664 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.385736 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.385776 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.385807 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:36Z","lastTransitionTime":"2026-03-07T07:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.488981 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.489019 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.489027 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.489040 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.489050 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:36Z","lastTransitionTime":"2026-03-07T07:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.592384 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.592477 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.592510 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.592545 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.592566 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:36Z","lastTransitionTime":"2026-03-07T07:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.695702 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.695764 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.695775 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.695793 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.695814 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:36Z","lastTransitionTime":"2026-03-07T07:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.799599 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.799670 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.799691 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.799723 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.799747 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:36Z","lastTransitionTime":"2026-03-07T07:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.903824 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.903897 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.903916 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.903947 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:36 crc kubenswrapper[4738]: I0307 07:01:36.903967 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:36Z","lastTransitionTime":"2026-03-07T07:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.007340 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.007413 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.007433 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.007466 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.007486 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:37Z","lastTransitionTime":"2026-03-07T07:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.112206 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.112269 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.112552 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.112584 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.112604 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:37Z","lastTransitionTime":"2026-03-07T07:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.216447 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.216518 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.216537 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.216562 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.216601 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:37Z","lastTransitionTime":"2026-03-07T07:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.320328 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.320400 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.320424 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.320452 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.320471 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:37Z","lastTransitionTime":"2026-03-07T07:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.385373 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.385417 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.385451 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:37 crc kubenswrapper[4738]: E0307 07:01:37.385586 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:37 crc kubenswrapper[4738]: E0307 07:01:37.385797 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:37 crc kubenswrapper[4738]: E0307 07:01:37.385947 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.424582 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.424643 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.424664 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.424702 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.424722 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:37Z","lastTransitionTime":"2026-03-07T07:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.528849 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.528920 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.528938 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.528965 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.528985 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:37Z","lastTransitionTime":"2026-03-07T07:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.632466 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.632528 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.632545 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.632577 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.632601 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:37Z","lastTransitionTime":"2026-03-07T07:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.736249 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.736326 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.736349 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.736376 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.736395 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:37Z","lastTransitionTime":"2026-03-07T07:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.839856 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.839904 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.839913 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.839933 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.839944 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:37Z","lastTransitionTime":"2026-03-07T07:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.947080 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.947142 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.947190 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.947216 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:37 crc kubenswrapper[4738]: I0307 07:01:37.947231 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:37Z","lastTransitionTime":"2026-03-07T07:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.050899 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.050983 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.051002 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.051030 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.051049 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:38Z","lastTransitionTime":"2026-03-07T07:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.154738 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.154803 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.154822 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.154849 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.154868 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:38Z","lastTransitionTime":"2026-03-07T07:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.259244 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.259325 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.259347 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.259378 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.259399 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:38Z","lastTransitionTime":"2026-03-07T07:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.362525 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.362612 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.362630 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.362656 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.362675 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:38Z","lastTransitionTime":"2026-03-07T07:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:38 crc kubenswrapper[4738]: E0307 07:01:38.388524 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zn2s9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-fmp5z_openshift-multus(3c725cf7-39ed-4a27-abf6-8e8346cc6ebd): CreateContainerConfigError: services have not yet been read at least once, cannot 
construct envvars" logger="UnhandledError" Mar 07 07:01:38 crc kubenswrapper[4738]: E0307 07:01:38.389893 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" podUID="3c725cf7-39ed-4a27-abf6-8e8346cc6ebd" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.466015 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.466182 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.466209 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.466274 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.466295 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:38Z","lastTransitionTime":"2026-03-07T07:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.569946 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.570018 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.570041 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.570068 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.570088 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:38Z","lastTransitionTime":"2026-03-07T07:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.674396 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.674464 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.674482 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.674515 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.674547 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:38Z","lastTransitionTime":"2026-03-07T07:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.680386 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk"] Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.681881 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.684764 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.685000 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.704557 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07
b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.723718 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9
8592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.742403 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.761678 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.773588 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.778402 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:38 crc 
kubenswrapper[4738]: I0307 07:01:38.778471 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.778490 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.778526 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.778549 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:38Z","lastTransitionTime":"2026-03-07T07:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.778595 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0211010e-3f22-4ac1-a9a4-bccb4552a2b6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hlnrk\" (UID: \"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.778678 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0211010e-3f22-4ac1-a9a4-bccb4552a2b6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hlnrk\" (UID: \"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.779019 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd2z5\" (UniqueName: \"kubernetes.io/projected/0211010e-3f22-4ac1-a9a4-bccb4552a2b6-kube-api-access-zd2z5\") pod \"ovnkube-control-plane-749d76644c-hlnrk\" (UID: \"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.779318 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0211010e-3f22-4ac1-a9a4-bccb4552a2b6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hlnrk\" (UID: \"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.786349 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.799493 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.812299 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.827464 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.840859 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.852997 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.876084 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.880252 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0211010e-3f22-4ac1-a9a4-bccb4552a2b6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hlnrk\" (UID: \"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.880341 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0211010e-3f22-4ac1-a9a4-bccb4552a2b6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hlnrk\" (UID: \"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.880384 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0211010e-3f22-4ac1-a9a4-bccb4552a2b6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hlnrk\" (UID: \"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.880431 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd2z5\" (UniqueName: \"kubernetes.io/projected/0211010e-3f22-4ac1-a9a4-bccb4552a2b6-kube-api-access-zd2z5\") pod \"ovnkube-control-plane-749d76644c-hlnrk\" (UID: \"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.881355 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.881403 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.881422 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.881435 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0211010e-3f22-4ac1-a9a4-bccb4552a2b6-env-overrides\") pod \"ovnkube-control-plane-749d76644c-hlnrk\" (UID: \"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.881449 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.881551 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:38Z","lastTransitionTime":"2026-03-07T07:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.882002 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0211010e-3f22-4ac1-a9a4-bccb4552a2b6-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-hlnrk\" (UID: \"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.884787 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0211010e-3f22-4ac1-a9a4-bccb4552a2b6-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-hlnrk\" (UID: \"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.890262 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.898236 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd2z5\" (UniqueName: \"kubernetes.io/projected/0211010e-3f22-4ac1-a9a4-bccb4552a2b6-kube-api-access-zd2z5\") pod \"ovnkube-control-plane-749d76644c-hlnrk\" (UID: \"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.901445 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.912667 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.984244 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.984298 4738 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.984315 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.984339 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:38 crc kubenswrapper[4738]: I0307 07:01:38.984355 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:38Z","lastTransitionTime":"2026-03-07T07:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.002530 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" Mar 07 07:01:39 crc kubenswrapper[4738]: W0307 07:01:39.022683 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0211010e_3f22_4ac1_a9a4_bccb4552a2b6.slice/crio-67cc07682741f438ae10b8e80e9f37627544d18ea1eb299e38e98e7156a8865c WatchSource:0}: Error finding container 67cc07682741f438ae10b8e80e9f37627544d18ea1eb299e38e98e7156a8865c: Status 404 returned error can't find the container with id 67cc07682741f438ae10b8e80e9f37627544d18ea1eb299e38e98e7156a8865c Mar 07 07:01:39 crc kubenswrapper[4738]: E0307 07:01:39.026836 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:39 crc kubenswrapper[4738]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 07 07:01:39 crc kubenswrapper[4738]: set -euo pipefail Mar 07 07:01:39 crc kubenswrapper[4738]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 07 07:01:39 crc kubenswrapper[4738]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 07 07:01:39 crc kubenswrapper[4738]: # As the secret mount is optional we must wait for the files to be present. Mar 07 07:01:39 crc kubenswrapper[4738]: # The service is created in monitor.yaml and this is created in sdn.yaml. 
Mar 07 07:01:39 crc kubenswrapper[4738]: TS=$(date +%s) Mar 07 07:01:39 crc kubenswrapper[4738]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 07 07:01:39 crc kubenswrapper[4738]: HAS_LOGGED_INFO=0 Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: log_missing_certs(){ Mar 07 07:01:39 crc kubenswrapper[4738]: CUR_TS=$(date +%s) Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 07 07:01:39 crc kubenswrapper[4738]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 07 07:01:39 crc kubenswrapper[4738]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. Mar 07 07:01:39 crc kubenswrapper[4738]: HAS_LOGGED_INFO=1 Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: } Mar 07 07:01:39 crc kubenswrapper[4738]: while [[ ! -f "${TLS_PK}" || ! 
-f "${TLS_CERT}" ]] ; do Mar 07 07:01:39 crc kubenswrapper[4738]: log_missing_certs Mar 07 07:01:39 crc kubenswrapper[4738]: sleep 5 Mar 07 07:01:39 crc kubenswrapper[4738]: done Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 07 07:01:39 crc kubenswrapper[4738]: exec /usr/bin/kube-rbac-proxy \ Mar 07 07:01:39 crc kubenswrapper[4738]: --logtostderr \ Mar 07 07:01:39 crc kubenswrapper[4738]: --secure-listen-address=:9108 \ Mar 07 07:01:39 crc kubenswrapper[4738]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 07 07:01:39 crc kubenswrapper[4738]: --upstream=http://127.0.0.1:29108/ \ Mar 07 07:01:39 crc kubenswrapper[4738]: --tls-private-key-file=${TLS_PK} \ Mar 07 07:01:39 crc kubenswrapper[4738]: --tls-cert-file=${TLS_CERT} Mar 07 07:01:39 crc kubenswrapper[4738]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zd2z5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-hlnrk_openshift-ovn-kubernetes(0211010e-3f22-4ac1-a9a4-bccb4552a2b6): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:39 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:39 crc kubenswrapper[4738]: E0307 07:01:39.029425 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:39 crc kubenswrapper[4738]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ -f "/env/_master" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: set -o allexport Mar 07 07:01:39 crc kubenswrapper[4738]: source "/env/_master" Mar 07 07:01:39 crc kubenswrapper[4738]: set +o allexport Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v4_join_subnet_opt= Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ "" != "" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 07 
07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v6_join_subnet_opt= Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ "" != "" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v4_transit_switch_subnet_opt= Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ "" != "" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v6_transit_switch_subnet_opt= Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ "" != "" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: dns_name_resolver_enabled_flag= Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ "false" == "true" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: persistent_ips_enabled_flag= Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ "true" == "true" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: # This is needed so that converting clusters from GA to TP Mar 07 07:01:39 crc kubenswrapper[4738]: # will rollout control plane pods as well Mar 07 07:01:39 crc kubenswrapper[4738]: network_segmentation_enabled_flag= Mar 07 07:01:39 crc kubenswrapper[4738]: multi_network_enabled_flag= Mar 07 07:01:39 crc 
kubenswrapper[4738]: if [[ "true" == "true" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: multi_network_enabled_flag="--enable-multi-network" Mar 07 07:01:39 crc kubenswrapper[4738]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 07 07:01:39 crc kubenswrapper[4738]: exec /usr/bin/ovnkube \ Mar 07 07:01:39 crc kubenswrapper[4738]: --enable-interconnect \ Mar 07 07:01:39 crc kubenswrapper[4738]: --init-cluster-manager "${K8S_NODE}" \ Mar 07 07:01:39 crc kubenswrapper[4738]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 07 07:01:39 crc kubenswrapper[4738]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 07 07:01:39 crc kubenswrapper[4738]: --metrics-bind-address "127.0.0.1:29108" \ Mar 07 07:01:39 crc kubenswrapper[4738]: --metrics-enable-pprof \ Mar 07 07:01:39 crc kubenswrapper[4738]: --metrics-enable-config-duration \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${ovn_v4_join_subnet_opt} \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${ovn_v6_join_subnet_opt} \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${dns_name_resolver_enabled_flag} \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${persistent_ips_enabled_flag} \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${multi_network_enabled_flag} \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${network_segmentation_enabled_flag} Mar 07 07:01:39 crc kubenswrapper[4738]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zd2z5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-hlnrk_openshift-ovn-kubernetes(0211010e-3f22-4ac1-a9a4-bccb4552a2b6): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:39 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:39 crc kubenswrapper[4738]: E0307 07:01:39.030743 4738 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" podUID="0211010e-3f22-4ac1-a9a4-bccb4552a2b6" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.088188 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.088319 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.088348 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.088372 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.088388 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:39Z","lastTransitionTime":"2026-03-07T07:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.191569 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.191643 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.191669 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.191703 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.191722 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:39Z","lastTransitionTime":"2026-03-07T07:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.294916 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.294991 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.295014 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.295042 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.295061 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:39Z","lastTransitionTime":"2026-03-07T07:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.385332 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.385399 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.385399 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:39 crc kubenswrapper[4738]: E0307 07:01:39.385577 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:39 crc kubenswrapper[4738]: E0307 07:01:39.385755 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:39 crc kubenswrapper[4738]: E0307 07:01:39.385950 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.399635 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.399717 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.399739 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.399772 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.399790 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:39Z","lastTransitionTime":"2026-03-07T07:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.413850 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-qpkbn"] Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.415585 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:39 crc kubenswrapper[4738]: E0307 07:01:39.415705 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.440967 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.455632 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.466798 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.480885 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.486635 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs\") pod \"network-metrics-daemon-qpkbn\" (UID: \"ba7ca967-58f7-4944-81d8-7bb8957707ad\") " pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.486729 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77lnm\" (UniqueName: 
\"kubernetes.io/projected/ba7ca967-58f7-4944-81d8-7bb8957707ad-kube-api-access-77lnm\") pod \"network-metrics-daemon-qpkbn\" (UID: \"ba7ca967-58f7-4944-81d8-7bb8957707ad\") " pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.490071 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read 
at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.502208 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.502263 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.502276 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.502294 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.502306 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:39Z","lastTransitionTime":"2026-03-07T07:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.504361 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.519637 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.542458 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.564451 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.579495 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.587790 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs\") pod \"network-metrics-daemon-qpkbn\" (UID: \"ba7ca967-58f7-4944-81d8-7bb8957707ad\") " pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 
07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.587857 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77lnm\" (UniqueName: \"kubernetes.io/projected/ba7ca967-58f7-4944-81d8-7bb8957707ad-kube-api-access-77lnm\") pod \"network-metrics-daemon-qpkbn\" (UID: \"ba7ca967-58f7-4944-81d8-7bb8957707ad\") " pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:39 crc kubenswrapper[4738]: E0307 07:01:39.588048 4738 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:01:39 crc kubenswrapper[4738]: E0307 07:01:39.588134 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs podName:ba7ca967-58f7-4944-81d8-7bb8957707ad nodeName:}" failed. No retries permitted until 2026-03-07 07:01:40.088109593 +0000 UTC m=+118.553096914 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs") pod "network-metrics-daemon-qpkbn" (UID: "ba7ca967-58f7-4944-81d8-7bb8957707ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.593931 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.602598 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.604091 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.604177 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.604191 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.604212 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.604227 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:39Z","lastTransitionTime":"2026-03-07T07:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.606353 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77lnm\" (UniqueName: \"kubernetes.io/projected/ba7ca967-58f7-4944-81d8-7bb8957707ad-kube-api-access-77lnm\") pod \"network-metrics-daemon-qpkbn\" (UID: \"ba7ca967-58f7-4944-81d8-7bb8957707ad\") " pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.620242 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.631708 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a9
8592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.643611 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.660241 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.706960 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.707035 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.707052 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.707081 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.707100 4738 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:39Z","lastTransitionTime":"2026-03-07T07:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.810919 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.810985 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.810994 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.811013 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.811029 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:39Z","lastTransitionTime":"2026-03-07T07:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.871304 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" event={"ID":"0211010e-3f22-4ac1-a9a4-bccb4552a2b6","Type":"ContainerStarted","Data":"67cc07682741f438ae10b8e80e9f37627544d18ea1eb299e38e98e7156a8865c"} Mar 07 07:01:39 crc kubenswrapper[4738]: E0307 07:01:39.873670 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:39 crc kubenswrapper[4738]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash Mar 07 07:01:39 crc kubenswrapper[4738]: set -euo pipefail Mar 07 07:01:39 crc kubenswrapper[4738]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key Mar 07 07:01:39 crc kubenswrapper[4738]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt Mar 07 07:01:39 crc kubenswrapper[4738]: # As the secret mount is optional we must wait for the files to be present. Mar 07 07:01:39 crc kubenswrapper[4738]: # The service is created in monitor.yaml and this is created in sdn.yaml. Mar 07 07:01:39 crc kubenswrapper[4738]: TS=$(date +%s) Mar 07 07:01:39 crc kubenswrapper[4738]: WARN_TS=$(( ${TS} + $(( 20 * 60)) )) Mar 07 07:01:39 crc kubenswrapper[4738]: HAS_LOGGED_INFO=0 Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: log_missing_certs(){ Mar 07 07:01:39 crc kubenswrapper[4738]: CUR_TS=$(date +%s) Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes. Mar 07 07:01:39 crc kubenswrapper[4738]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then Mar 07 07:01:39 crc kubenswrapper[4738]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes. 
Mar 07 07:01:39 crc kubenswrapper[4738]: HAS_LOGGED_INFO=1 Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: } Mar 07 07:01:39 crc kubenswrapper[4738]: while [[ ! -f "${TLS_PK}" || ! -f "${TLS_CERT}" ]] ; do Mar 07 07:01:39 crc kubenswrapper[4738]: log_missing_certs Mar 07 07:01:39 crc kubenswrapper[4738]: sleep 5 Mar 07 07:01:39 crc kubenswrapper[4738]: done Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy Mar 07 07:01:39 crc kubenswrapper[4738]: exec /usr/bin/kube-rbac-proxy \ Mar 07 07:01:39 crc kubenswrapper[4738]: --logtostderr \ Mar 07 07:01:39 crc kubenswrapper[4738]: --secure-listen-address=:9108 \ Mar 07 07:01:39 crc kubenswrapper[4738]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \ Mar 07 07:01:39 crc kubenswrapper[4738]: --upstream=http://127.0.0.1:29108/ \ Mar 07 07:01:39 crc kubenswrapper[4738]: --tls-private-key-file=${TLS_PK} \ Mar 07 07:01:39 crc kubenswrapper[4738]: --tls-cert-file=${TLS_CERT} Mar 07 07:01:39 crc kubenswrapper[4738]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zd2z5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-hlnrk_openshift-ovn-kubernetes(0211010e-3f22-4ac1-a9a4-bccb4552a2b6): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:39 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:39 crc kubenswrapper[4738]: E0307 07:01:39.876308 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:39 crc kubenswrapper[4738]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ -f "/env/_master" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: set -o allexport Mar 07 07:01:39 crc kubenswrapper[4738]: source "/env/_master" Mar 07 07:01:39 crc kubenswrapper[4738]: set +o allexport Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v4_join_subnet_opt= Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ "" != "" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet " Mar 07 
07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v6_join_subnet_opt= Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ "" != "" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet " Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v4_transit_switch_subnet_opt= Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ "" != "" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet " Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v6_transit_switch_subnet_opt= Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ "" != "" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet " Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: dns_name_resolver_enabled_flag= Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ "false" == "true" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver" Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: persistent_ips_enabled_flag= Mar 07 07:01:39 crc kubenswrapper[4738]: if [[ "true" == "true" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: persistent_ips_enabled_flag="--enable-persistent-ips" Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: # This is needed so that converting clusters from GA to TP Mar 07 07:01:39 crc kubenswrapper[4738]: # will rollout control plane pods as well Mar 07 07:01:39 crc kubenswrapper[4738]: network_segmentation_enabled_flag= Mar 07 07:01:39 crc kubenswrapper[4738]: multi_network_enabled_flag= Mar 07 07:01:39 crc 
kubenswrapper[4738]: if [[ "true" == "true" ]]; then Mar 07 07:01:39 crc kubenswrapper[4738]: multi_network_enabled_flag="--enable-multi-network" Mar 07 07:01:39 crc kubenswrapper[4738]: network_segmentation_enabled_flag="--enable-network-segmentation" Mar 07 07:01:39 crc kubenswrapper[4738]: fi Mar 07 07:01:39 crc kubenswrapper[4738]: Mar 07 07:01:39 crc kubenswrapper[4738]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}" Mar 07 07:01:39 crc kubenswrapper[4738]: exec /usr/bin/ovnkube \ Mar 07 07:01:39 crc kubenswrapper[4738]: --enable-interconnect \ Mar 07 07:01:39 crc kubenswrapper[4738]: --init-cluster-manager "${K8S_NODE}" \ Mar 07 07:01:39 crc kubenswrapper[4738]: --config-file=/run/ovnkube-config/ovnkube.conf \ Mar 07 07:01:39 crc kubenswrapper[4738]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \ Mar 07 07:01:39 crc kubenswrapper[4738]: --metrics-bind-address "127.0.0.1:29108" \ Mar 07 07:01:39 crc kubenswrapper[4738]: --metrics-enable-pprof \ Mar 07 07:01:39 crc kubenswrapper[4738]: --metrics-enable-config-duration \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${ovn_v4_join_subnet_opt} \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${ovn_v6_join_subnet_opt} \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${ovn_v4_transit_switch_subnet_opt} \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${ovn_v6_transit_switch_subnet_opt} \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${dns_name_resolver_enabled_flag} \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${persistent_ips_enabled_flag} \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${multi_network_enabled_flag} \ Mar 07 07:01:39 crc kubenswrapper[4738]: ${network_segmentation_enabled_flag} Mar 07 07:01:39 crc kubenswrapper[4738]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zd2z5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-hlnrk_openshift-ovn-kubernetes(0211010e-3f22-4ac1-a9a4-bccb4552a2b6): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:39 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:39 crc kubenswrapper[4738]: E0307 07:01:39.877492 4738 pod_workers.go:1301] "Error syncing pod, 
skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" podUID="0211010e-3f22-4ac1-a9a4-bccb4552a2b6" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.883655 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.901736 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.914591 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.914668 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.914683 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.914710 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.914725 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:39Z","lastTransitionTime":"2026-03-07T07:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.920428 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.935266 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.950727 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.963284 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.974000 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.984448 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:39 crc kubenswrapper[4738]: I0307 07:01:39.995898 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.009032 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.017641 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.017678 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.017686 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.017703 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.017714 4738 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:40Z","lastTransitionTime":"2026-03-07T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.023376 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.037729 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.059604 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.072205 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.090407 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.093941 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs\") pod \"network-metrics-daemon-qpkbn\" (UID: \"ba7ca967-58f7-4944-81d8-7bb8957707ad\") " pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:40 crc kubenswrapper[4738]: E0307 07:01:40.094094 4738 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:01:40 crc kubenswrapper[4738]: E0307 07:01:40.094215 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs podName:ba7ca967-58f7-4944-81d8-7bb8957707ad nodeName:}" failed. No retries permitted until 2026-03-07 07:01:41.094192939 +0000 UTC m=+119.559180270 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs") pod "network-metrics-daemon-qpkbn" (UID: "ba7ca967-58f7-4944-81d8-7bb8957707ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.103309 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.120987 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.121035 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.121049 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.121075 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.121091 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:40Z","lastTransitionTime":"2026-03-07T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.164582 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.164630 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.164651 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.164676 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.164696 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:40Z","lastTransitionTime":"2026-03-07T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:40 crc kubenswrapper[4738]: E0307 07:01:40.179452 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.184334 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.184364 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.184376 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.184392 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.184406 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:40Z","lastTransitionTime":"2026-03-07T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:40 crc kubenswrapper[4738]: E0307 07:01:40.196693 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.200853 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.200895 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.200907 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.200925 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.200937 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:40Z","lastTransitionTime":"2026-03-07T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:40 crc kubenswrapper[4738]: E0307 07:01:40.210778 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.214927 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.214956 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.214967 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.214981 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.214993 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:40Z","lastTransitionTime":"2026-03-07T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:40 crc kubenswrapper[4738]: E0307 07:01:40.227265 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.231174 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.231219 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.231234 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.231260 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.231275 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:40Z","lastTransitionTime":"2026-03-07T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:40 crc kubenswrapper[4738]: E0307 07:01:40.242581 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: E0307 07:01:40.242751 4738 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.244670 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.244715 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.244724 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.244746 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.244762 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:40Z","lastTransitionTime":"2026-03-07T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.348611 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.348683 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.348696 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.348721 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.348735 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:40Z","lastTransitionTime":"2026-03-07T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.386499 4738 scope.go:117] "RemoveContainer" containerID="1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233" Mar 07 07:01:40 crc kubenswrapper[4738]: E0307 07:01:40.387612 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:40 crc kubenswrapper[4738]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 07 07:01:40 crc kubenswrapper[4738]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 07 07:01:40 crc kubenswrapper[4738]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{
Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-622rm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-54cnw_openshift-multus(c0a91659-d53f-4694-82a7-8c66445ab4f5): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:40 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:40 crc kubenswrapper[4738]: E0307 07:01:40.389025 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-54cnw" podUID="c0a91659-d53f-4694-82a7-8c66445ab4f5" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.452233 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.452284 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.452297 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.452322 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.452338 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:40Z","lastTransitionTime":"2026-03-07T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.555023 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.555064 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.555074 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.555092 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.555105 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:40Z","lastTransitionTime":"2026-03-07T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.658652 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.659142 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.659154 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.659197 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.659209 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:40Z","lastTransitionTime":"2026-03-07T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.761588 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.761625 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.761636 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.761653 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.761664 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:40Z","lastTransitionTime":"2026-03-07T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.865963 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.866033 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.866057 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.866099 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.866122 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:40Z","lastTransitionTime":"2026-03-07T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.877627 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.880234 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c"} Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.881019 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.895294 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.906064 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.916308 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.924914 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.946745 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.970406 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.970483 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.970507 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.970539 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.970563 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:40Z","lastTransitionTime":"2026-03-07T07:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.971250 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.985272 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:40 crc kubenswrapper[4738]: I0307 07:01:40.997281 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.007445 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.031699 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.043836 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.060375 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.073421 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.073932 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.073962 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.073974 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.073993 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.074005 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:41Z","lastTransitionTime":"2026-03-07T07:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.089354 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.105793 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs\") pod \"network-metrics-daemon-qpkbn\" (UID: \"ba7ca967-58f7-4944-81d8-7bb8957707ad\") " pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:41 crc kubenswrapper[4738]: E0307 07:01:41.105981 4738 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:01:41 crc kubenswrapper[4738]: E0307 07:01:41.106072 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs podName:ba7ca967-58f7-4944-81d8-7bb8957707ad nodeName:}" failed. No retries permitted until 2026-03-07 07:01:43.106048701 +0000 UTC m=+121.571036042 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs") pod "network-metrics-daemon-qpkbn" (UID: "ba7ca967-58f7-4944-81d8-7bb8957707ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.117726 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.128048 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.176387 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.176448 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.176465 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.176488 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.176503 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:41Z","lastTransitionTime":"2026-03-07T07:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.279935 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.279988 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.279999 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.280014 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.280059 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:41Z","lastTransitionTime":"2026-03-07T07:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.383830 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.383907 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.383934 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.383969 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.383996 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:41Z","lastTransitionTime":"2026-03-07T07:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.385014 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.385187 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.385269 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:41 crc kubenswrapper[4738]: E0307 07:01:41.385370 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:41 crc kubenswrapper[4738]: E0307 07:01:41.385487 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:41 crc kubenswrapper[4738]: E0307 07:01:41.385582 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.385719 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:41 crc kubenswrapper[4738]: E0307 07:01:41.385969 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:01:41 crc kubenswrapper[4738]: E0307 07:01:41.387675 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:41 crc kubenswrapper[4738]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 07 07:01:41 crc kubenswrapper[4738]: apiVersion: v1 Mar 07 07:01:41 crc kubenswrapper[4738]: clusters: Mar 07 07:01:41 crc kubenswrapper[4738]: - cluster: Mar 07 07:01:41 crc kubenswrapper[4738]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 07 07:01:41 crc kubenswrapper[4738]: server: https://api-int.crc.testing:6443 Mar 07 07:01:41 crc kubenswrapper[4738]: name: default-cluster Mar 07 07:01:41 crc kubenswrapper[4738]: contexts: Mar 07 07:01:41 crc kubenswrapper[4738]: - context: Mar 07 07:01:41 crc kubenswrapper[4738]: cluster: default-cluster Mar 07 07:01:41 crc kubenswrapper[4738]: namespace: default Mar 07 07:01:41 crc kubenswrapper[4738]: user: default-auth Mar 07 07:01:41 crc kubenswrapper[4738]: name: default-context Mar 07 07:01:41 crc kubenswrapper[4738]: current-context: default-context Mar 07 07:01:41 crc kubenswrapper[4738]: kind: Config Mar 07 07:01:41 crc kubenswrapper[4738]: preferences: {} Mar 07 07:01:41 crc kubenswrapper[4738]: users: Mar 07 07:01:41 crc kubenswrapper[4738]: - name: default-auth Mar 07 
07:01:41 crc kubenswrapper[4738]: user: Mar 07 07:01:41 crc kubenswrapper[4738]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 07 07:01:41 crc kubenswrapper[4738]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 07 07:01:41 crc kubenswrapper[4738]: EOF Mar 07 07:01:41 crc kubenswrapper[4738]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v46jv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:41 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:41 crc kubenswrapper[4738]: E0307 07:01:41.388098 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start 
--payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fwsww,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" 
logger="UnhandledError" Mar 07 07:01:41 crc kubenswrapper[4738]: E0307 07:01:41.389505 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" Mar 07 07:01:41 crc kubenswrapper[4738]: E0307 07:01:41.390595 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fwsww,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 07 07:01:41 crc kubenswrapper[4738]: E0307 07:01:41.392127 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.487806 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.487882 4738 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.487906 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.487939 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.487965 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:41Z","lastTransitionTime":"2026-03-07T07:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.591349 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.591401 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.591416 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.591435 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.591451 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:41Z","lastTransitionTime":"2026-03-07T07:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.694941 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.695008 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.695026 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.695058 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.695078 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:41Z","lastTransitionTime":"2026-03-07T07:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.798685 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.798744 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.798759 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.798780 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.798793 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:41Z","lastTransitionTime":"2026-03-07T07:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.902533 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.902609 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.902633 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.902666 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:41 crc kubenswrapper[4738]: I0307 07:01:41.902693 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:41Z","lastTransitionTime":"2026-03-07T07:01:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.006085 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.006205 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.006238 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.006270 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.006290 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:42Z","lastTransitionTime":"2026-03-07T07:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.110586 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.110666 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.110684 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.110713 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.110732 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:42Z","lastTransitionTime":"2026-03-07T07:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.215129 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.215206 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.215219 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.215240 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.215253 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:42Z","lastTransitionTime":"2026-03-07T07:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.318120 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.318211 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.318226 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.318254 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.318268 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:42Z","lastTransitionTime":"2026-03-07T07:01:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:42 crc kubenswrapper[4738]: E0307 07:01:42.387823 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:01:42 crc kubenswrapper[4738]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 07 07:01:42 crc kubenswrapper[4738]: set -uo pipefail Mar 07 07:01:42 crc kubenswrapper[4738]: Mar 07 07:01:42 crc kubenswrapper[4738]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 07 07:01:42 crc kubenswrapper[4738]: Mar 07 07:01:42 crc kubenswrapper[4738]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 07 07:01:42 crc kubenswrapper[4738]: HOSTS_FILE="/etc/hosts" Mar 07 07:01:42 crc kubenswrapper[4738]: TEMP_FILE="/etc/hosts.tmp" Mar 07 07:01:42 crc kubenswrapper[4738]: Mar 07 07:01:42 crc kubenswrapper[4738]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 07 07:01:42 crc kubenswrapper[4738]: Mar 07 07:01:42 crc kubenswrapper[4738]: # Make a temporary file with the old hosts file's attributes. Mar 07 07:01:42 crc kubenswrapper[4738]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 07 07:01:42 crc kubenswrapper[4738]: echo "Failed to preserve hosts file. Exiting." Mar 07 07:01:42 crc kubenswrapper[4738]: exit 1 Mar 07 07:01:42 crc kubenswrapper[4738]: fi Mar 07 07:01:42 crc kubenswrapper[4738]: Mar 07 07:01:42 crc kubenswrapper[4738]: while true; do Mar 07 07:01:42 crc kubenswrapper[4738]: declare -A svc_ips Mar 07 07:01:42 crc kubenswrapper[4738]: for svc in "${services[@]}"; do Mar 07 07:01:42 crc kubenswrapper[4738]: # Fetch service IP from cluster dns if present. We make several tries Mar 07 07:01:42 crc kubenswrapper[4738]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. 
The two last ones Mar 07 07:01:42 crc kubenswrapper[4738]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 07 07:01:42 crc kubenswrapper[4738]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 07 07:01:42 crc kubenswrapper[4738]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 07 07:01:42 crc kubenswrapper[4738]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 07 07:01:42 crc kubenswrapper[4738]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 07 07:01:42 crc kubenswrapper[4738]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 07 07:01:42 crc kubenswrapper[4738]: for i in ${!cmds[*]} Mar 07 07:01:42 crc kubenswrapper[4738]: do Mar 07 07:01:42 crc kubenswrapper[4738]: ips=($(eval "${cmds[i]}")) Mar 07 07:01:42 crc kubenswrapper[4738]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 07 07:01:42 crc kubenswrapper[4738]: svc_ips["${svc}"]="${ips[@]}" Mar 07 07:01:42 crc kubenswrapper[4738]: break Mar 07 07:01:42 crc kubenswrapper[4738]: fi Mar 07 07:01:42 crc kubenswrapper[4738]: done Mar 07 07:01:42 crc kubenswrapper[4738]: done Mar 07 07:01:42 crc kubenswrapper[4738]: Mar 07 07:01:42 crc kubenswrapper[4738]: # Update /etc/hosts only if we get valid service IPs Mar 07 07:01:42 crc kubenswrapper[4738]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 07 07:01:42 crc kubenswrapper[4738]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 07 07:01:42 crc kubenswrapper[4738]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 07 07:01:42 crc kubenswrapper[4738]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 07 07:01:42 crc kubenswrapper[4738]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 07 07:01:42 crc kubenswrapper[4738]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 07 07:01:42 crc kubenswrapper[4738]: sleep 60 & wait Mar 07 07:01:42 crc kubenswrapper[4738]: continue Mar 07 07:01:42 crc kubenswrapper[4738]: fi Mar 07 07:01:42 crc kubenswrapper[4738]: Mar 07 07:01:42 crc kubenswrapper[4738]: # Append resolver entries for services Mar 07 07:01:42 crc kubenswrapper[4738]: rc=0 Mar 07 07:01:42 crc kubenswrapper[4738]: for svc in "${!svc_ips[@]}"; do Mar 07 07:01:42 crc kubenswrapper[4738]: for ip in ${svc_ips[${svc}]}; do Mar 07 07:01:42 crc kubenswrapper[4738]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 07 07:01:42 crc kubenswrapper[4738]: done Mar 07 07:01:42 crc kubenswrapper[4738]: done Mar 07 07:01:42 crc kubenswrapper[4738]: if [[ $rc -ne 0 ]]; then Mar 07 07:01:42 crc kubenswrapper[4738]: sleep 60 & wait Mar 07 07:01:42 crc kubenswrapper[4738]: continue Mar 07 07:01:42 crc kubenswrapper[4738]: fi Mar 07 07:01:42 crc kubenswrapper[4738]: Mar 07 07:01:42 crc kubenswrapper[4738]: Mar 07 07:01:42 crc kubenswrapper[4738]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 07 07:01:42 crc kubenswrapper[4738]: # Replace /etc/hosts with our modified version if needed Mar 07 07:01:42 crc kubenswrapper[4738]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 07 07:01:42 crc kubenswrapper[4738]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 07 07:01:42 crc kubenswrapper[4738]: fi Mar 07 07:01:42 crc kubenswrapper[4738]: sleep 60 & wait Mar 07 07:01:42 crc kubenswrapper[4738]: unset svc_ips Mar 07 07:01:42 crc kubenswrapper[4738]: done Mar 07 07:01:42 crc kubenswrapper[4738]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zllrn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-mbtvs_openshift-dns(ed4db713-ac09-4c8e-ab4c-f9031c78d476): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 07 07:01:42 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:01:42 crc kubenswrapper[4738]: E0307 07:01:42.388996 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-mbtvs" 
podUID="ed4db713-ac09-4c8e-ab4c-f9031c78d476" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.397021 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.408670 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: E0307 07:01:42.419123 4738 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.422219 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.437112 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.447819 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.466646 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.478908 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.493373 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.504379 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.516390 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.526268 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.537212 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.549202 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.557812 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.570121 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.586032 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:42 crc kubenswrapper[4738]: I0307 07:01:42.914098 4738 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 07 07:01:43 crc kubenswrapper[4738]: I0307 07:01:43.130473 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs\") pod \"network-metrics-daemon-qpkbn\" (UID: \"ba7ca967-58f7-4944-81d8-7bb8957707ad\") " pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:43 crc kubenswrapper[4738]: E0307 07:01:43.130697 4738 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:01:43 crc kubenswrapper[4738]: E0307 07:01:43.130837 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs 
podName:ba7ca967-58f7-4944-81d8-7bb8957707ad nodeName:}" failed. No retries permitted until 2026-03-07 07:01:47.130795506 +0000 UTC m=+125.595782877 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs") pod "network-metrics-daemon-qpkbn" (UID: "ba7ca967-58f7-4944-81d8-7bb8957707ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:01:43 crc kubenswrapper[4738]: I0307 07:01:43.385459 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:43 crc kubenswrapper[4738]: I0307 07:01:43.385587 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:43 crc kubenswrapper[4738]: I0307 07:01:43.385615 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:43 crc kubenswrapper[4738]: E0307 07:01:43.385688 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:43 crc kubenswrapper[4738]: I0307 07:01:43.385796 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:43 crc kubenswrapper[4738]: E0307 07:01:43.385787 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:43 crc kubenswrapper[4738]: E0307 07:01:43.385919 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:43 crc kubenswrapper[4738]: E0307 07:01:43.386238 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:01:43 crc kubenswrapper[4738]: E0307 07:01:43.576548 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 07 07:01:44 crc kubenswrapper[4738]: I0307 07:01:44.897426 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6tq9s" event={"ID":"d81bc0bc-b25f-4d1d-b384-2e220823aa3a","Type":"ContainerStarted","Data":"8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4"} Mar 07 07:01:44 crc kubenswrapper[4738]: I0307 07:01:44.899616 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124"} Mar 07 07:01:44 crc kubenswrapper[4738]: I0307 07:01:44.911877 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c8
5a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:44 crc kubenswrapper[4738]: I0307 07:01:44.942886 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:44 crc kubenswrapper[4738]: I0307 07:01:44.962845 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:44 crc kubenswrapper[4738]: I0307 07:01:44.984010 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.004121 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.019545 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.041288 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.056519 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.073110 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.084105 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.099840 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.122276 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.139911 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.152216 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.162468 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.173845 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.189302 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.204004 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.225466 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53f
b606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4
a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.240339 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.255760 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.274104 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.288891 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.302829 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.316230 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.329452 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.338382 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.351067 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.371484 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.384789 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.384851 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.384967 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:45 crc kubenswrapper[4738]: E0307 07:01:45.384984 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:45 crc kubenswrapper[4738]: E0307 07:01:45.385108 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.385214 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:45 crc kubenswrapper[4738]: E0307 07:01:45.385426 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:45 crc kubenswrapper[4738]: E0307 07:01:45.385732 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.389853 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.400410 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.419304 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.904360 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a"} Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.904418 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6"} Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.916780 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.935827 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.950122 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.965530 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.984975 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:45 crc kubenswrapper[4738]: I0307 07:01:45.995320 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:46 crc kubenswrapper[4738]: I0307 07:01:46.020918 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tr
ue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53f
b606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4
a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:46 crc kubenswrapper[4738]: I0307 07:01:46.035428 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:46 crc kubenswrapper[4738]: I0307 07:01:46.048621 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:46 crc kubenswrapper[4738]: I0307 07:01:46.061709 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:46 crc kubenswrapper[4738]: I0307 07:01:46.083715 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 07 07:01:46 crc kubenswrapper[4738]: I0307 07:01:46.098684 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:46Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:46 crc kubenswrapper[4738]: I0307 07:01:46.114394 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:46Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:46 crc kubenswrapper[4738]: I0307 07:01:46.131254 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:46Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:46 crc kubenswrapper[4738]: I0307 07:01:46.188509 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:46Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:46 crc kubenswrapper[4738]: I0307 07:01:46.207953 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:46Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:47 crc kubenswrapper[4738]: I0307 07:01:47.181778 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs\") pod \"network-metrics-daemon-qpkbn\" (UID: \"ba7ca967-58f7-4944-81d8-7bb8957707ad\") " pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:47 crc kubenswrapper[4738]: E0307 07:01:47.182004 4738 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:01:47 crc kubenswrapper[4738]: E0307 07:01:47.182132 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs podName:ba7ca967-58f7-4944-81d8-7bb8957707ad nodeName:}" failed. No retries permitted until 2026-03-07 07:01:55.182104949 +0000 UTC m=+133.647092280 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs") pod "network-metrics-daemon-qpkbn" (UID: "ba7ca967-58f7-4944-81d8-7bb8957707ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:01:47 crc kubenswrapper[4738]: I0307 07:01:47.384812 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:47 crc kubenswrapper[4738]: I0307 07:01:47.384878 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:47 crc kubenswrapper[4738]: I0307 07:01:47.384926 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:47 crc kubenswrapper[4738]: I0307 07:01:47.385010 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:47 crc kubenswrapper[4738]: E0307 07:01:47.385126 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:47 crc kubenswrapper[4738]: E0307 07:01:47.385384 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:47 crc kubenswrapper[4738]: E0307 07:01:47.385506 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:47 crc kubenswrapper[4738]: E0307 07:01:47.385657 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:01:47 crc kubenswrapper[4738]: I0307 07:01:47.912451 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188"} Mar 07 07:01:47 crc kubenswrapper[4738]: I0307 07:01:47.935143 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:47Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:47 crc kubenswrapper[4738]: I0307 07:01:47.954583 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:47Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:47 crc kubenswrapper[4738]: I0307 07:01:47.973686 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:47Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:48 crc 
kubenswrapper[4738]: I0307 07:01:48.000865 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:47Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:48 crc kubenswrapper[4738]: I0307 07:01:48.015014 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:48Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:48 crc kubenswrapper[4738]: I0307 07:01:48.041065 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06
:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:48Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:48 crc kubenswrapper[4738]: I0307 07:01:48.060042 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:48Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:48 crc kubenswrapper[4738]: I0307 07:01:48.084413 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:48Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:48 crc kubenswrapper[4738]: I0307 07:01:48.101493 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:48Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:48 crc kubenswrapper[4738]: I0307 07:01:48.116655 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:01:48Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:48 crc kubenswrapper[4738]: I0307 07:01:48.137683 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:48Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:48 crc kubenswrapper[4738]: I0307 07:01:48.159021 4738 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:48Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:48 crc kubenswrapper[4738]: I0307 07:01:48.178230 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:48Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:48 crc kubenswrapper[4738]: I0307 07:01:48.195835 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:48Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:48 crc kubenswrapper[4738]: I0307 07:01:48.214295 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:48Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:48 crc kubenswrapper[4738]: I0307 07:01:48.239334 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:48Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:48 crc kubenswrapper[4738]: E0307 07:01:48.577459 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:01:49 crc kubenswrapper[4738]: I0307 07:01:49.385311 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:49 crc kubenswrapper[4738]: I0307 07:01:49.385374 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:49 crc kubenswrapper[4738]: I0307 07:01:49.385456 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:49 crc kubenswrapper[4738]: I0307 07:01:49.385538 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.385562 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.385489 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.385723 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.385838 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:01:49 crc kubenswrapper[4738]: I0307 07:01:49.511192 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.511442 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:02:21.511401879 +0000 UTC m=+159.976389240 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:01:49 crc kubenswrapper[4738]: I0307 07:01:49.511636 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:49 crc kubenswrapper[4738]: I0307 07:01:49.511715 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.511825 4738 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.511929 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:02:21.511909174 +0000 UTC m=+159.976896535 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.512041 4738 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.512220 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:02:21.512191443 +0000 UTC m=+159.977178794 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:01:49 crc kubenswrapper[4738]: I0307 07:01:49.612950 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:49 crc kubenswrapper[4738]: I0307 07:01:49.613058 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.613246 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.613286 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.613307 4738 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.613376 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 07:02:21.613353934 +0000 UTC m=+160.078341275 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.613246 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.613416 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.613434 4738 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:49 crc kubenswrapper[4738]: E0307 07:01:49.613484 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 07:02:21.613466377 +0000 UTC m=+160.078453708 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.371364 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.371430 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.371439 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.371457 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.371469 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:50Z","lastTransitionTime":"2026-03-07T07:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:50 crc kubenswrapper[4738]: E0307 07:01:50.390140 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:50Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.394686 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.394757 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.394773 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.394813 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.394833 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:50Z","lastTransitionTime":"2026-03-07T07:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:50 crc kubenswrapper[4738]: E0307 07:01:50.411851 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:50Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.417811 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.417874 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.417895 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.417921 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.417941 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:50Z","lastTransitionTime":"2026-03-07T07:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:50 crc kubenswrapper[4738]: E0307 07:01:50.436849 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:50Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.442176 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.442237 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.442251 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.442275 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.442294 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:50Z","lastTransitionTime":"2026-03-07T07:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:50 crc kubenswrapper[4738]: E0307 07:01:50.459350 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:50Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.468530 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.468687 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.468804 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.468912 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:01:50 crc kubenswrapper[4738]: I0307 07:01:50.469013 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:01:50Z","lastTransitionTime":"2026-03-07T07:01:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:01:50 crc kubenswrapper[4738]: E0307 07:01:50.488237 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:50Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:50 crc kubenswrapper[4738]: E0307 07:01:50.488386 4738 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 07:01:51 crc kubenswrapper[4738]: I0307 07:01:51.384915 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:51 crc kubenswrapper[4738]: I0307 07:01:51.384962 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:51 crc kubenswrapper[4738]: E0307 07:01:51.385101 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:51 crc kubenswrapper[4738]: I0307 07:01:51.385253 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:51 crc kubenswrapper[4738]: E0307 07:01:51.385491 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:51 crc kubenswrapper[4738]: I0307 07:01:51.385866 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:51 crc kubenswrapper[4738]: E0307 07:01:51.386194 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:51 crc kubenswrapper[4738]: E0307 07:01:51.386317 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:01:51 crc kubenswrapper[4738]: I0307 07:01:51.927374 4738 generic.go:334] "Generic (PLEG): container finished" podID="3c725cf7-39ed-4a27-abf6-8e8346cc6ebd" containerID="f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366" exitCode=0 Mar 07 07:01:51 crc kubenswrapper[4738]: I0307 07:01:51.927430 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" event={"ID":"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd","Type":"ContainerDied","Data":"f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366"} Mar 07 07:01:51 crc kubenswrapper[4738]: I0307 07:01:51.945313 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/b
in\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\
"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:51Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:51 crc kubenswrapper[4738]: I0307 07:01:51.962778 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:51Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.001392 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06
:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:51Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.023592 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.038673 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.052135 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.065050 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.082854 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.101505 4738 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.116438 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.129854 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.143746 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.166511 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.180628 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.193933 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.206187 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc 
kubenswrapper[4738]: I0307 07:01:52.399650 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6
rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.433100 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4
9117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\
\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6de
c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.459210 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.477399 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.496573 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.511240 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.534130 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.554802 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.580709 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.615243 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: E0307 07:01:52.655463 4738 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c725cf7_39ed_4a27_abf6_8e8346cc6ebd.slice/crio-d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd.scope\": RecentStats: unable to find data in memory cache]" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.657601 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least 
once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.681591 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.695305 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.709495 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.725583 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.740321 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc 
kubenswrapper[4738]: I0307 07:01:52.933708 4738 generic.go:334] "Generic (PLEG): container finished" podID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerID="0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb" exitCode=0 Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.933824 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerDied","Data":"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb"} Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.936987 4738 generic.go:334] "Generic (PLEG): container finished" podID="3c725cf7-39ed-4a27-abf6-8e8346cc6ebd" containerID="d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd" exitCode=0 Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.937063 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" event={"ID":"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd","Type":"ContainerDied","Data":"d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd"} Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.957582 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc kubenswrapper[4738]: I0307 07:01:52.975509 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:52 crc 
kubenswrapper[4738]: I0307 07:01:52.994187 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.015703 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.037559 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.060238 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.079677 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.102455 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06
:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.127184 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.144356 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.160438 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.183588 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.195747 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.221326 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.237669 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.248376 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.263866 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.283681 4738 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.302241 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.318974 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.336255 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.358823 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.375174 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.387808 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.387825 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.387897 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.387966 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:53 crc kubenswrapper[4738]: E0307 07:01:53.388175 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:53 crc kubenswrapper[4738]: E0307 07:01:53.388307 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:53 crc kubenswrapper[4738]: E0307 07:01:53.388364 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:01:53 crc kubenswrapper[4738]: E0307 07:01:53.388848 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.396908 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.398881 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.408598 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc 
kubenswrapper[4738]: I0307 07:01:53.430580 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.442835 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.468218 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06
:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.484344 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.499403 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.515522 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.535987 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:53 crc kubenswrapper[4738]: E0307 07:01:53.579498 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.947429 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerStarted","Data":"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153"} Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.947506 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerStarted","Data":"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41"} Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.947527 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerStarted","Data":"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044"} Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.947546 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerStarted","Data":"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4"} Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.947564 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" 
event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerStarted","Data":"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609"} Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.947585 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerStarted","Data":"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d"} Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.951477 4738 generic.go:334] "Generic (PLEG): container finished" podID="3c725cf7-39ed-4a27-abf6-8e8346cc6ebd" containerID="d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0" exitCode=0 Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.951579 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" event={"ID":"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd","Type":"ContainerDied","Data":"d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0"} Mar 07 07:01:53 crc kubenswrapper[4738]: I0307 07:01:53.973144 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.003280 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06
:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.019680 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] 
Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee12
20d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.041543 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.067361 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.083623 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.101753 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 
07:01:54.121520 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.136237 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.157941 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.180514 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.209087 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.224444 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.237750 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.258943 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.274525 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.289985 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc 
kubenswrapper[4738]: I0307 07:01:54.305040 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.323265 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been 
read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.339181 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc 
kubenswrapper[4738]: I0307 07:01:54.359404 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.379377 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.398399 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.421469 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 
07:01:54.439003 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":t
rue,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.472792 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\
\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"
},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.494004 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3a
bbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.517949 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.537404 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.560366 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.576127 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.596342 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.614630 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.634888 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.650429 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.961189 4738 generic.go:334] "Generic (PLEG): container finished" podID="3c725cf7-39ed-4a27-abf6-8e8346cc6ebd" containerID="d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29" exitCode=0 Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.961296 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" event={"ID":"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd","Type":"ContainerDied","Data":"d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29"} Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.964426 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerStarted","Data":"64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e"} Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.964493 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" 
event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerStarted","Data":"6686bb75ec61d2366c90bdd12be9fac7b166c389fab7afae68a9fa902670abc3"} Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.968376 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" event={"ID":"0211010e-3f22-4ac1-a9a4-bccb4552a2b6","Type":"ContainerStarted","Data":"bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97"} Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.968417 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" event={"ID":"0211010e-3f22-4ac1-a9a4-bccb4552a2b6","Type":"ContainerStarted","Data":"3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d"} Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.969957 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-54cnw" event={"ID":"c0a91659-d53f-4694-82a7-8c66445ab4f5","Type":"ContainerStarted","Data":"a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f"} Mar 07 07:01:54 crc kubenswrapper[4738]: I0307 07:01:54.983750 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.003005 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.022493 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.038463 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.064267 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.078509 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.092971 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.111877 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.136310 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.151081 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.170421 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with 
unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.185153 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs\") pod \"network-metrics-daemon-qpkbn\" (UID: \"ba7ca967-58f7-4944-81d8-7bb8957707ad\") " pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:55 crc kubenswrapper[4738]: E0307 07:01:55.185340 4738 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:01:55 crc kubenswrapper[4738]: E0307 07:01:55.185435 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs podName:ba7ca967-58f7-4944-81d8-7bb8957707ad nodeName:}" failed. No retries permitted until 2026-03-07 07:02:11.185403289 +0000 UTC m=+149.650390610 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs") pod "network-metrics-daemon-qpkbn" (UID: "ba7ca967-58f7-4944-81d8-7bb8957707ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.186008 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.195346 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.206886 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.217013 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.226070 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.236354 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.253216 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.265196 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.284400 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.300321 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.311944 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.324838 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.335532 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.345965 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.359353 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.372426 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.384429 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.384849 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.384930 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:55 crc kubenswrapper[4738]: E0307 07:01:55.384989 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.385065 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.384940 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:55 crc kubenswrapper[4738]: E0307 07:01:55.385270 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:55 crc kubenswrapper[4738]: E0307 07:01:55.385455 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:01:55 crc kubenswrapper[4738]: E0307 07:01:55.385703 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.403630 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.425240 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.453666 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06
:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.485744 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3a
bbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.503600 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.528258 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:55Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.980241 4738 generic.go:334] "Generic (PLEG): container finished" podID="3c725cf7-39ed-4a27-abf6-8e8346cc6ebd" containerID="b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066" exitCode=0 Mar 07 07:01:55 crc kubenswrapper[4738]: I0307 07:01:55.980338 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" event={"ID":"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd","Type":"ContainerDied","Data":"b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066"} Mar 07 
07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.008454 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.027757 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.050692 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06
:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.071042 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3a
bbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.089226 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.117681 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.136813 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.154489 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.178688 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.199434 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.213950 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.231368 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.253995 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.277419 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T0
6:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.293001 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.305819 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.322218 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:56Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:56 crc 
kubenswrapper[4738]: I0307 07:01:56.992673 4738 generic.go:334] "Generic (PLEG): container finished" podID="3c725cf7-39ed-4a27-abf6-8e8346cc6ebd" containerID="96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a" exitCode=0 Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.992787 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" event={"ID":"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd","Type":"ContainerDied","Data":"96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a"} Mar 07 07:01:56 crc kubenswrapper[4738]: I0307 07:01:56.999277 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerStarted","Data":"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8"} Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.015175 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.036575 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.050030 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc 
kubenswrapper[4738]: I0307 07:01:57.061527 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6
rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.085868 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4
9117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\
\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6de
c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.124655 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3a
bbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.156576 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.177914 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.190938 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.208176 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.222599 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.235680 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.245484 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.260109 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.285500 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.301215 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.314567 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-schedule
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\
\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:57Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.384747 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.384901 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.384905 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:57 crc kubenswrapper[4738]: E0307 07:01:57.386380 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:01:57 crc kubenswrapper[4738]: I0307 07:01:57.384996 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:57 crc kubenswrapper[4738]: E0307 07:01:57.385401 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:57 crc kubenswrapper[4738]: E0307 07:01:57.386682 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:57 crc kubenswrapper[4738]: E0307 07:01:57.386333 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.009732 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" event={"ID":"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd","Type":"ContainerStarted","Data":"8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239"} Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.012964 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-mbtvs" event={"ID":"ed4db713-ac09-4c8e-ab4c-f9031c78d476","Type":"ContainerStarted","Data":"3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a"} Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.039010 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3a
bbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.057119 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.071169 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.084007 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.101523 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.114250 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.137120 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.154915 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.168371 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.182536 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.205212 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.222676 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.241923 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.261116 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.279666 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.295901 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.309966 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc 
kubenswrapper[4738]: I0307 07:01:58.331003 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.347840 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.359506 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.375313 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.398563 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.414726 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.1
68.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.431771 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-schedule
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\
\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.447652 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.466191 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.480513 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc 
kubenswrapper[4738]: I0307 07:01:58.493723 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6
rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.513096 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4
9117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\
\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6de
c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.531324 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3a
bbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.548144 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.565097 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.578098 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:58 crc kubenswrapper[4738]: E0307 07:01:58.580861 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:01:58 crc kubenswrapper[4738]: I0307 07:01:58.594641 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\
\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":
[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:58Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.022893 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerStarted","Data":"e8f7a8b396648b4c33f5e44649910d2dd70834798104c30152c28fc2987d6edd"} Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.023374 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.023535 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.023620 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.050182 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.060714 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.062226 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.067194 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.082651 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc 
kubenswrapper[4738]: I0307 07:01:59.097470 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.114555 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.130294 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.149063 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.164259 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.189298 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.203552 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.216928 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.234194 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.258590 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f7a8b396648b4c33f5e44649910d2dd70834798104c30152c28fc2987d6edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.274048 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.294056 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.312120 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.329211 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.350726 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.365658 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.378207 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc 
kubenswrapper[4738]: I0307 07:01:59.385417 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.385464 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.385496 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.385468 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:01:59 crc kubenswrapper[4738]: E0307 07:01:59.385614 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:01:59 crc kubenswrapper[4738]: E0307 07:01:59.385775 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:01:59 crc kubenswrapper[4738]: E0307 07:01:59.385835 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:01:59 crc kubenswrapper[4738]: E0307 07:01:59.385887 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.400335 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.417878 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.445284 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.465571 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.481834 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.499783 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.512551 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.529888 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.547505 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.560491 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.574950 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.594294 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f7a8b396648b4c33f5e44649910d2dd70834798104c30152c28fc2987d6edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.606791 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:01:59 crc kubenswrapper[4738]: I0307 07:01:59.623531 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856b
ae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:01:59Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.804024 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.804077 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.804086 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.804109 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.804122 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:00Z","lastTransitionTime":"2026-03-07T07:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:02:00 crc kubenswrapper[4738]: E0307 07:02:00.817617 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:00Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.822067 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.822113 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.822125 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.822167 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.822179 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:00Z","lastTransitionTime":"2026-03-07T07:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:00 crc kubenswrapper[4738]: E0307 07:02:00.834376 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:00Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.839317 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.839374 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.839388 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.839412 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.839429 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:00Z","lastTransitionTime":"2026-03-07T07:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:00 crc kubenswrapper[4738]: E0307 07:02:00.852769 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:00Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.857510 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.857558 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.857573 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.857594 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.857610 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:00Z","lastTransitionTime":"2026-03-07T07:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:00 crc kubenswrapper[4738]: E0307 07:02:00.871483 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:00Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.881222 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.881272 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.881284 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.881304 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:00 crc kubenswrapper[4738]: I0307 07:02:00.881317 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:00Z","lastTransitionTime":"2026-03-07T07:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:00 crc kubenswrapper[4738]: E0307 07:02:00.893807 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:00Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:00 crc kubenswrapper[4738]: E0307 07:02:00.893963 4738 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 07:02:01 crc kubenswrapper[4738]: I0307 07:02:01.384742 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:01 crc kubenswrapper[4738]: I0307 07:02:01.384736 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:01 crc kubenswrapper[4738]: E0307 07:02:01.385403 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:01 crc kubenswrapper[4738]: I0307 07:02:01.384818 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:01 crc kubenswrapper[4738]: I0307 07:02:01.384792 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:01 crc kubenswrapper[4738]: E0307 07:02:01.385612 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:01 crc kubenswrapper[4738]: E0307 07:02:01.385653 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:01 crc kubenswrapper[4738]: E0307 07:02:01.385736 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.037135 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovnkube-controller/0.log" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.040782 4738 generic.go:334] "Generic (PLEG): container finished" podID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerID="e8f7a8b396648b4c33f5e44649910d2dd70834798104c30152c28fc2987d6edd" exitCode=1 Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.040846 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerDied","Data":"e8f7a8b396648b4c33f5e44649910d2dd70834798104c30152c28fc2987d6edd"} Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.041630 4738 scope.go:117] "RemoveContainer" containerID="e8f7a8b396648b4c33f5e44649910d2dd70834798104c30152c28fc2987d6edd" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.065577 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.081746 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.098331 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.114679 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.136036 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.155658 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.173508 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.188981 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.211329 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f7a8b396648b4c33f5e44649910d2dd70834798104c30152c28fc2987d6edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8f7a8b396648b4c33f5e44649910d2dd70834798104c30152c28fc2987d6edd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:01Z\\\",\\\"message\\\":\\\"ice/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 07:02:01.365325 6743 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0307 07:02:01.365357 6743 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI0307 07:02:01.365364 6743 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 07:02:01.365387 6743 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0307 07:02:01.365410 6743 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0307 07:02:01.365423 6743 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0307 07:02:01.365449 6743 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0307 07:02:01.365449 6743 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0307 07:02:01.365464 6743 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0307 07:02:01.365454 6743 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0307 07:02:01.365495 6743 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0307 07:02:01.365446 6743 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 07:02:01.365522 6743 handler.go:208] Removed *v1.Node event handler 7\\\\nI0307 07:02:01.365537 6743 factory.go:656] Stopping watch factory\\\\nI0307 07:02:01.365555 6743 ovnkube.go:599] Stopped ovnkube\\\\nI0307 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.229566 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990
f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.245202 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.257826 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.270076 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.282765 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.293738 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.305839 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc 
kubenswrapper[4738]: I0307 07:02:02.320143 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.403738 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.422096 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.437300 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc 
kubenswrapper[4738]: I0307 07:02:02.490122 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.516384 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.548488 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.564841 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.590828 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.607336 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.621246 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.636516 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.651054 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.664378 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.680506 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.708581 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e8f7a8b396648b4c33f5e44649910d2dd70834798104c30152c28fc2987d6edd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8f7a8b396648b4c33f5e44649910d2dd70834798104c30152c28fc2987d6edd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:01Z\\\",\\\"message\\\":\\\"ice/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 07:02:01.365325 6743 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0307 07:02:01.365357 6743 handler.go:190] Sending *v1.Node event 
handler 2 for removal\\\\nI0307 07:02:01.365364 6743 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 07:02:01.365387 6743 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0307 07:02:01.365410 6743 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0307 07:02:01.365423 6743 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0307 07:02:01.365449 6743 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0307 07:02:01.365449 6743 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0307 07:02:01.365464 6743 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0307 07:02:01.365454 6743 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0307 07:02:01.365495 6743 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0307 07:02:01.365446 6743 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 07:02:01.365522 6743 handler.go:208] Removed *v1.Node event handler 7\\\\nI0307 07:02:01.365537 6743 factory.go:656] Stopping watch factory\\\\nI0307 07:02:01.365555 6743 ovnkube.go:599] Stopped ovnkube\\\\nI0307 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.724511 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990
f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:02 crc kubenswrapper[4738]: I0307 07:02:02.739833 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.046232 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovnkube-controller/1.log" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.047041 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovnkube-controller/0.log" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.050351 4738 generic.go:334] "Generic (PLEG): container finished" podID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerID="9c65d5a63bb09c29d3f15878b9bc176bf09895ad9f9f479ba55db112cda4d650" exitCode=1 Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.050397 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerDied","Data":"9c65d5a63bb09c29d3f15878b9bc176bf09895ad9f9f479ba55db112cda4d650"} Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.050463 4738 scope.go:117] "RemoveContainer" 
containerID="e8f7a8b396648b4c33f5e44649910d2dd70834798104c30152c28fc2987d6edd" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.051509 4738 scope.go:117] "RemoveContainer" containerID="9c65d5a63bb09c29d3f15878b9bc176bf09895ad9f9f479ba55db112cda4d650" Mar 07 07:02:03 crc kubenswrapper[4738]: E0307 07:02:03.051841 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.066720 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.080965 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.094099 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc 
kubenswrapper[4738]: I0307 07:02:03.114292 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.129816 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.143992 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.158216 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.171347 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.192510 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.209019 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.226422 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.243003 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.254381 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.274372 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.298252 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c65d5a63bb09c29d3f15878b9bc176bf09895ad9f9f479ba55db112cda4d650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8f7a8b396648b4c33f5e44649910d2dd70834798104c30152c28fc2987d6edd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:01Z\\\",\\\"message\\\":\\\"ice/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 07:02:01.365325 6743 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0307 07:02:01.365357 6743 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 07:02:01.365364 6743 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 
07:02:01.365387 6743 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0307 07:02:01.365410 6743 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0307 07:02:01.365423 6743 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0307 07:02:01.365449 6743 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0307 07:02:01.365449 6743 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0307 07:02:01.365464 6743 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0307 07:02:01.365454 6743 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0307 07:02:01.365495 6743 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0307 07:02:01.365446 6743 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 07:02:01.365522 6743 handler.go:208] Removed *v1.Node event handler 7\\\\nI0307 07:02:01.365537 6743 factory.go:656] Stopping watch factory\\\\nI0307 07:02:01.365555 6743 ovnkube.go:599] Stopped ovnkube\\\\nI0307 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c65d5a63bb09c29d3f15878b9bc176bf09895ad9f9f479ba55db112cda4d650\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:02Z\\\",\\\"message\\\":\\\"17.4.219,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.219],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0307 07:02:02.978644 6882 
model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0307 07:02:02.979045 6882 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\
\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.317397 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.335321 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:03Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.384691 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.384711 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.384809 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:03 crc kubenswrapper[4738]: I0307 07:02:03.384873 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:03 crc kubenswrapper[4738]: E0307 07:02:03.385014 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:03 crc kubenswrapper[4738]: E0307 07:02:03.385225 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:03 crc kubenswrapper[4738]: E0307 07:02:03.385386 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:03 crc kubenswrapper[4738]: E0307 07:02:03.385496 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:03 crc kubenswrapper[4738]: E0307 07:02:03.582841 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 07 07:02:04 crc kubenswrapper[4738]: I0307 07:02:04.056947 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovnkube-controller/1.log" Mar 07 07:02:05 crc kubenswrapper[4738]: I0307 07:02:05.385635 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:05 crc kubenswrapper[4738]: E0307 07:02:05.386260 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:05 crc kubenswrapper[4738]: I0307 07:02:05.385977 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:05 crc kubenswrapper[4738]: E0307 07:02:05.386383 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:05 crc kubenswrapper[4738]: I0307 07:02:05.385996 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:05 crc kubenswrapper[4738]: E0307 07:02:05.386470 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:05 crc kubenswrapper[4738]: I0307 07:02:05.385977 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:05 crc kubenswrapper[4738]: E0307 07:02:05.386556 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:07 crc kubenswrapper[4738]: I0307 07:02:07.470618 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:07 crc kubenswrapper[4738]: E0307 07:02:07.470783 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:07 crc kubenswrapper[4738]: I0307 07:02:07.470882 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:07 crc kubenswrapper[4738]: I0307 07:02:07.470889 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:07 crc kubenswrapper[4738]: I0307 07:02:07.471014 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:07 crc kubenswrapper[4738]: E0307 07:02:07.471028 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:07 crc kubenswrapper[4738]: E0307 07:02:07.471253 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:07 crc kubenswrapper[4738]: E0307 07:02:07.471348 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:08 crc kubenswrapper[4738]: E0307 07:02:08.583690 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:02:09 crc kubenswrapper[4738]: I0307 07:02:09.385430 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:09 crc kubenswrapper[4738]: E0307 07:02:09.385644 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:09 crc kubenswrapper[4738]: I0307 07:02:09.385973 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:09 crc kubenswrapper[4738]: I0307 07:02:09.386007 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:09 crc kubenswrapper[4738]: I0307 07:02:09.386033 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:09 crc kubenswrapper[4738]: E0307 07:02:09.386100 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:09 crc kubenswrapper[4738]: E0307 07:02:09.386293 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:09 crc kubenswrapper[4738]: E0307 07:02:09.386387 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.215388 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs\") pod \"network-metrics-daemon-qpkbn\" (UID: \"ba7ca967-58f7-4944-81d8-7bb8957707ad\") " pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:11 crc kubenswrapper[4738]: E0307 07:02:11.215747 4738 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:02:11 crc kubenswrapper[4738]: E0307 07:02:11.215879 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs podName:ba7ca967-58f7-4944-81d8-7bb8957707ad nodeName:}" failed. No retries permitted until 2026-03-07 07:02:43.215850903 +0000 UTC m=+181.680838444 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs") pod "network-metrics-daemon-qpkbn" (UID: "ba7ca967-58f7-4944-81d8-7bb8957707ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.255770 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.255816 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.255827 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.255843 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.255855 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:11Z","lastTransitionTime":"2026-03-07T07:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:11 crc kubenswrapper[4738]: E0307 07:02:11.271785 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:11Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.277393 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.277465 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.277483 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.277511 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.277533 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:11Z","lastTransitionTime":"2026-03-07T07:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.304236 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.304296 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.304318 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.304346 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.304365 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:11Z","lastTransitionTime":"2026-03-07T07:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.329028 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.329074 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.329085 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.329106 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.329119 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:11Z","lastTransitionTime":"2026-03-07T07:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:11 crc kubenswrapper[4738]: E0307 07:02:11.345208 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:11Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.349677 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.349737 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.349748 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.349764 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.349795 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:11Z","lastTransitionTime":"2026-03-07T07:02:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:11 crc kubenswrapper[4738]: E0307 07:02:11.369235 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:11Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:11 crc kubenswrapper[4738]: E0307 07:02:11.369345 4738 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.385461 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.385567 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.385624 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:11 crc kubenswrapper[4738]: I0307 07:02:11.385651 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:11 crc kubenswrapper[4738]: E0307 07:02:11.385585 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:11 crc kubenswrapper[4738]: E0307 07:02:11.385780 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:11 crc kubenswrapper[4738]: E0307 07:02:11.385973 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:11 crc kubenswrapper[4738]: E0307 07:02:11.386101 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.405453 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metri
cs-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.423626 4738 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.444992 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.465365 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.484284 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.509739 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c65d5a63bb09c29d3f15878b9bc176bf09895ad9f9f479ba55db112cda4d650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e8f7a8b396648b4c33f5e44649910d2dd70834798104c30152c28fc2987d6edd\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:01Z\\\",\\\"message\\\":\\\"ice/v1/apis/informers/externalversions/factory.go:140\\\\nI0307 07:02:01.365325 6743 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0307 07:02:01.365357 6743 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0307 07:02:01.365364 6743 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0307 
07:02:01.365387 6743 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0307 07:02:01.365410 6743 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0307 07:02:01.365423 6743 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0307 07:02:01.365449 6743 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0307 07:02:01.365449 6743 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0307 07:02:01.365464 6743 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0307 07:02:01.365454 6743 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0307 07:02:01.365495 6743 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0307 07:02:01.365446 6743 handler.go:208] Removed *v1.Node event handler 2\\\\nI0307 07:02:01.365522 6743 handler.go:208] Removed *v1.Node event handler 7\\\\nI0307 07:02:01.365537 6743 factory.go:656] Stopping watch factory\\\\nI0307 07:02:01.365555 6743 ovnkube.go:599] Stopped ovnkube\\\\nI0307 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:58Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c65d5a63bb09c29d3f15878b9bc176bf09895ad9f9f479ba55db112cda4d650\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:02Z\\\",\\\"message\\\":\\\"17.4.219,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.219],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0307 07:02:02.978644 6882 
model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0307 07:02:02.979045 6882 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\
\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\
\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.527481 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.542897 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.555782 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.571443 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc 
kubenswrapper[4738]: I0307 07:02:12.591839 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc8
8930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.605924 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc0
86a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.630280 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.645945 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.662647 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.678961 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:12 crc kubenswrapper[4738]: I0307 07:02:12.692380 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:12Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:13 crc kubenswrapper[4738]: I0307 07:02:13.385451 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:13 crc kubenswrapper[4738]: I0307 07:02:13.385561 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:13 crc kubenswrapper[4738]: I0307 07:02:13.385642 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:13 crc kubenswrapper[4738]: I0307 07:02:13.385690 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:13 crc kubenswrapper[4738]: E0307 07:02:13.386237 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:13 crc kubenswrapper[4738]: E0307 07:02:13.386593 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:13 crc kubenswrapper[4738]: E0307 07:02:13.386807 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:13 crc kubenswrapper[4738]: E0307 07:02:13.387226 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:13 crc kubenswrapper[4738]: I0307 07:02:13.401697 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 07 07:02:13 crc kubenswrapper[4738]: E0307 07:02:13.586273 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:02:15 crc kubenswrapper[4738]: I0307 07:02:15.384886 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:15 crc kubenswrapper[4738]: I0307 07:02:15.384948 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:15 crc kubenswrapper[4738]: I0307 07:02:15.385011 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:15 crc kubenswrapper[4738]: I0307 07:02:15.385183 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:15 crc kubenswrapper[4738]: E0307 07:02:15.385329 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:15 crc kubenswrapper[4738]: E0307 07:02:15.385453 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:15 crc kubenswrapper[4738]: E0307 07:02:15.385554 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:15 crc kubenswrapper[4738]: E0307 07:02:15.385634 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:15 crc kubenswrapper[4738]: I0307 07:02:15.397627 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 07 07:02:17 crc kubenswrapper[4738]: I0307 07:02:17.385257 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:17 crc kubenswrapper[4738]: I0307 07:02:17.385342 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:17 crc kubenswrapper[4738]: I0307 07:02:17.385378 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:17 crc kubenswrapper[4738]: E0307 07:02:17.386691 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:17 crc kubenswrapper[4738]: E0307 07:02:17.386842 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:17 crc kubenswrapper[4738]: E0307 07:02:17.386869 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:17 crc kubenswrapper[4738]: I0307 07:02:17.385474 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:17 crc kubenswrapper[4738]: E0307 07:02:17.387537 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.386404 4738 scope.go:117] "RemoveContainer" containerID="9c65d5a63bb09c29d3f15878b9bc176bf09895ad9f9f479ba55db112cda4d650" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.408937 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.430867 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.453947 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.471030 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.498507 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.522494 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c65d5a63bb09c29d3f15878b9bc176bf09895ad9f9f479ba55db112cda4d650\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c65d5a63bb09c29d3f15878b9bc176bf09895ad9f9f479ba55db112cda4d650\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:02Z\\\",\\\"message\\\":\\\"17.4.219,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.219],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,In
ternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0307 07:02:02.978644 6882 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0307 07:02:02.979045 6882 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab
265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.538194 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.554982 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1099e7a-8864-4149-980e-c91ac87f5cd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://697506c1286108c2dedc27fda25723fdcf24bb4f78ffc7a57fbf8c7e291f3522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.574055 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c61088f1-3d39-43ba-bc86-bf8eccb61166\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc4b3c9f8fe720d56b45f2fff2c285864e7854909f89cf011551a7d80815722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 07:00:15.269407 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 07:00:15.272712 1 observer_polling.go:159] Starting file observer\\\\nI0307 07:00:15.274611 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 07:00:15.275744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 07:00:42.727995 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0307 07:00:45.715959 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0403a34c44fc04e213d8734bc09b1339c16fed303ce66be6c0c4d1a17794c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://caf48b03181b9be68af545dd350d90b780fb7693c67532b7849de1b07b3f65b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: E0307 07:02:18.588125 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.591801 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.610180 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.629969 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc 
kubenswrapper[4738]: I0307 07:02:18.648708 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc8
8930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.661066 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc0
86a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.689617 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.709548 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.729312 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.748260 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:18 crc kubenswrapper[4738]: I0307 07:02:18.772057 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:18Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.124029 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovnkube-controller/1.log" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.127080 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerStarted","Data":"4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553"} Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.127664 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.140507 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.151870 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.164501 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.175462 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc 
kubenswrapper[4738]: I0307 07:02:19.186337 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1099e7a-8864-4149-980e-c91ac87f5cd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://697506c1286108c2dedc27fda25723fdcf24bb4f78ffc7a57fbf8c7e291f3522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.202508 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c61088f1-3d39-43ba-bc86-bf8eccb61166\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc4b3c9f8fe720d56b45f2fff2c285864e7854909f89cf011551a7d80815722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 07:00:15.269407 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 07:00:15.272712 1 observer_polling.go:159] Starting file observer\\\\nI0307 07:00:15.274611 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 07:00:15.275744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 07:00:42.727995 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0307 07:00:45.715959 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0403a34c44fc04e213d8734bc09b1339c16fed303ce66be6c0c4d1a17794c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://caf48b03181b9be68af545dd350d90b780fb7693c67532b7849de1b07b3f65b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.220814 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.235686 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.250982 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.266460 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.278138 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.301902 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.321525 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.331232 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.342253 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.357762 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c65d5a63bb09c29d3f15878b9bc176bf09895ad9f9f479ba55db112cda4d650\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:02Z\\\",\\\"message\\\":\\\"17.4.219,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.219],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,In
ternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0307 07:02:02.978644 6882 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0307 07:02:02.979045 6882 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.369089 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990
f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.385518 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.385576 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.385576 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.385580 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:19 crc kubenswrapper[4738]: E0307 07:02:19.385677 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:19 crc kubenswrapper[4738]: E0307 07:02:19.385827 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:19 crc kubenswrapper[4738]: E0307 07:02:19.385914 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:19 crc kubenswrapper[4738]: E0307 07:02:19.386005 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.387805 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:19 crc kubenswrapper[4738]: I0307 07:02:19.402761 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:19Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.134981 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovnkube-controller/2.log" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.136047 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovnkube-controller/1.log" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.140713 4738 generic.go:334] 
"Generic (PLEG): container finished" podID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerID="4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553" exitCode=1 Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.140774 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerDied","Data":"4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553"} Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.140828 4738 scope.go:117] "RemoveContainer" containerID="9c65d5a63bb09c29d3f15878b9bc176bf09895ad9f9f479ba55db112cda4d650" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.141868 4738 scope.go:117] "RemoveContainer" containerID="4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553" Mar 07 07:02:20 crc kubenswrapper[4738]: E0307 07:02:20.142204 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.160430 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.181293 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.202752 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1099e7a-8864-4149-980e-c91ac87f5cd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://697506c1286108c2dedc27fda25723fdcf24bb4f78ffc7a57fbf8c7e291f3522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.225528 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c61088f1-3d39-43ba-bc86-bf8eccb61166\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc4b3c9f8fe720d56b45f2fff2c285864e7854909f89cf011551a7d80815722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 07:00:15.269407 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 07:00:15.272712 1 observer_polling.go:159] Starting file observer\\\\nI0307 07:00:15.274611 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 07:00:15.275744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 07:00:42.727995 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0307 07:00:45.715959 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0403a34c44fc04e213d8734bc09b1339c16fed303ce66be6c0c4d1a17794c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://caf48b03181b9be68af545dd350d90b780fb7693c67532b7849de1b07b3f65b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.250389 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.269983 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.285883 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.302191 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.317376 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.344430 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.366541 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.383570 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.399472 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.425014 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c65d5a63bb09c29d3f15878b9bc176bf09895ad9f9f479ba55db112cda4d650\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:02Z\\\",\\\"message\\\":\\\"17.4.219,Type:ClusterIP,ExternalIPs:[],SessionAffinity:None,LoadBalancerIP:,LoadBalancerSourceRanges:[],ExternalName:,ExternalTrafficPolicy:,HealthCheckNodePort:0,PublishNotReadyAddresses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.219],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,In
ternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI0307 07:02:02.978644 6882 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-apiserver-operator/metrics]} name:Service_openshift-apiserver-operator/metrics_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.38:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {8b82f026-5975-4a1b-bb18-08d5d51147ec}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0307 07:02:02.979045 6882 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:19Z\\\",\\\"message\\\":\\\"penshift-multus/multus-54cnw\\\\nI0307 07:02:19.295238 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295245 7079 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-54cnw in node crc\\\\nI0307 07:02:19.295250 7079 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-54cnw after 0 failed attempt(s)\\\\nI0307 07:02:19.295256 7079 default_network_controller.go:776] Recording success event on pod 
openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295262 7079 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0307 07:02:19.295270 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0307 07:02:19.295182 7079 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed callin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",
\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.439573 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.456955 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.470652 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.483808 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:20 crc kubenswrapper[4738]: I0307 07:02:20.505866 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:20Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.146500 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovnkube-controller/2.log" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.151128 4738 scope.go:117] "RemoveContainer" containerID="4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553" Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.151344 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.171768 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.191893 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1099e7a-8864-4149-980e-c91ac87f5cd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://697506c1286108c2dedc27fda25723fdcf24bb4f78ffc7a57fbf8c7e291f3522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.212495 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c61088f1-3d39-43ba-bc86-bf8eccb61166\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc4b3c9f8fe720d56b45f2fff2c285864e7854909f89cf011551a7d80815722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 07:00:15.269407 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 07:00:15.272712 1 observer_polling.go:159] Starting file observer\\\\nI0307 07:00:15.274611 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 07:00:15.275744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 07:00:42.727995 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0307 07:00:45.715959 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0403a34c44fc04e213d8734bc09b1339c16fed303ce66be6c0c4d1a17794c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://caf48b03181b9be68af545dd350d90b780fb7693c67532b7849de1b07b3f65b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.232580 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.254483 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.282186 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc 
kubenswrapper[4738]: I0307 07:02:21.326962 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.345954 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.363724 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.383231 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.385423 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.385499 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.385503 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.385550 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.385578 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.385746 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.385869 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.385947 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.401284 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.428442 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085
a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\
\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\
\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.443287 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.463317 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.485266 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.500696 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.520137 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.547094 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.547387 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:25.547339594 +0000 UTC m=+224.012326975 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.547510 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.547582 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.547679 4738 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.547774 4738 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.547789 4738 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:03:25.547760375 +0000 UTC m=+224.012747726 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.547894 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:03:25.547874708 +0000 UTC m=+224.012862029 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.551684 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:19Z\\\",\\\"message\\\":\\\"penshift-multus/multus-54cnw\\\\nI0307 07:02:19.295238 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295245 7079 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-54cnw in node crc\\\\nI0307 07:02:19.295250 7079 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-multus/multus-54cnw after 0 failed attempt(s)\\\\nI0307 07:02:19.295256 7079 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295262 7079 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0307 07:02:19.295270 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0307 07:02:19.295182 7079 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed callin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab
265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.568373 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990
f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.632646 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.632704 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.632713 4738 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.632735 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.632749 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:21Z","lastTransitionTime":"2026-03-07T07:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.648515 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.648643 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.648785 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.648837 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.648851 4738 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.648913 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 07:03:25.648893015 +0000 UTC m=+224.113880336 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.648797 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.649064 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.649072 4738 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.649096 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 07:03:25.64908985 +0000 UTC m=+224.114077171 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.652386 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.656862 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.656894 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.656903 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.656920 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.656931 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:21Z","lastTransitionTime":"2026-03-07T07:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.673046 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.677836 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.677874 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.677883 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.677900 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.677911 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:21Z","lastTransitionTime":"2026-03-07T07:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.690751 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.696446 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.696492 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.696506 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.696526 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.696537 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:21Z","lastTransitionTime":"2026-03-07T07:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.714990 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.720356 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.720419 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.720435 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.720468 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:21 crc kubenswrapper[4738]: I0307 07:02:21.720488 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:21Z","lastTransitionTime":"2026-03-07T07:02:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.736771 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:21Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:21 crc kubenswrapper[4738]: E0307 07:02:21.736912 4738 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.410177 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1099e7a-8864-4149-980e-c91ac87f5cd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://697506c1286108c2dedc27fda25723fdcf24bb4f78ffc7a57fbf8c7e291f3522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 
07:02:22.432064 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c61088f1-3d39-43ba-bc86-bf8eccb61166\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc4b3c9f8fe720d56b45f2fff2c285864e7854909f89cf011551a7d80815722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 07:00:15.269407 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 07:00:15.272712 1 observer_polling.go:159] Starting file observer\\\\nI0307 07:00:15.274611 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 07:00:15.275744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 07:00:42.727995 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0307 07:00:45.715959 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0403a34c44fc04e213d8734bc09b1339c16fed303ce66be6c0c4d1a17794c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://caf48b03181b9be68af545dd350d90b780fb7693c67532b7849de1b07b3f65b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.451306 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.468560 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.483197 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc 
kubenswrapper[4738]: I0307 07:02:22.498048 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6
rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.531390 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4
9117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\
\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6de
c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.555053 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3a
bbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.581293 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.602211 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.617687 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.643397 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.666250 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.686867 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.705658 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.729464 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.762647 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:19Z\\\",\\\"message\\\":\\\"penshift-multus/multus-54cnw\\\\nI0307 07:02:19.295238 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295245 7079 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-54cnw in node crc\\\\nI0307 07:02:19.295250 7079 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-multus/multus-54cnw after 0 failed attempt(s)\\\\nI0307 07:02:19.295256 7079 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295262 7079 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0307 07:02:19.295270 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0307 07:02:19.295182 7079 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed callin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab
265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.781684 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990
f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:22 crc kubenswrapper[4738]: I0307 07:02:22.803868 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:22Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:23 crc kubenswrapper[4738]: I0307 07:02:23.384851 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:23 crc kubenswrapper[4738]: I0307 07:02:23.384890 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:23 crc kubenswrapper[4738]: E0307 07:02:23.385011 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:23 crc kubenswrapper[4738]: I0307 07:02:23.385051 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:23 crc kubenswrapper[4738]: I0307 07:02:23.385135 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:23 crc kubenswrapper[4738]: E0307 07:02:23.385266 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:23 crc kubenswrapper[4738]: E0307 07:02:23.385352 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:23 crc kubenswrapper[4738]: E0307 07:02:23.385483 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:23 crc kubenswrapper[4738]: E0307 07:02:23.589553 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 07 07:02:25 crc kubenswrapper[4738]: I0307 07:02:25.385366 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:25 crc kubenswrapper[4738]: I0307 07:02:25.385393 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:25 crc kubenswrapper[4738]: I0307 07:02:25.385424 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:25 crc kubenswrapper[4738]: E0307 07:02:25.385834 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:25 crc kubenswrapper[4738]: I0307 07:02:25.385416 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:25 crc kubenswrapper[4738]: E0307 07:02:25.385573 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:25 crc kubenswrapper[4738]: E0307 07:02:25.385904 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:25 crc kubenswrapper[4738]: E0307 07:02:25.385984 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:27 crc kubenswrapper[4738]: I0307 07:02:27.385033 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:27 crc kubenswrapper[4738]: I0307 07:02:27.385122 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:27 crc kubenswrapper[4738]: I0307 07:02:27.385143 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:27 crc kubenswrapper[4738]: E0307 07:02:27.385206 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:27 crc kubenswrapper[4738]: E0307 07:02:27.385282 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:27 crc kubenswrapper[4738]: I0307 07:02:27.385371 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:27 crc kubenswrapper[4738]: E0307 07:02:27.385438 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:27 crc kubenswrapper[4738]: E0307 07:02:27.385495 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:28 crc kubenswrapper[4738]: E0307 07:02:28.590861 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:02:29 crc kubenswrapper[4738]: I0307 07:02:29.385269 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:29 crc kubenswrapper[4738]: I0307 07:02:29.385325 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:29 crc kubenswrapper[4738]: I0307 07:02:29.385365 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:29 crc kubenswrapper[4738]: I0307 07:02:29.385325 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:29 crc kubenswrapper[4738]: E0307 07:02:29.385417 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:29 crc kubenswrapper[4738]: E0307 07:02:29.385610 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:29 crc kubenswrapper[4738]: E0307 07:02:29.385634 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:29 crc kubenswrapper[4738]: E0307 07:02:29.385677 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.385462 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.385534 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.385534 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.385561 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:31 crc kubenswrapper[4738]: E0307 07:02:31.385720 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:31 crc kubenswrapper[4738]: E0307 07:02:31.385944 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:31 crc kubenswrapper[4738]: E0307 07:02:31.386128 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:31 crc kubenswrapper[4738]: E0307 07:02:31.386237 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.870115 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.870238 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.870277 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.870309 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.870330 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:31Z","lastTransitionTime":"2026-03-07T07:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:31 crc kubenswrapper[4738]: E0307 07:02:31.893834 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:31Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.900321 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.900387 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.900403 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.900427 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.900445 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:31Z","lastTransitionTime":"2026-03-07T07:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:31 crc kubenswrapper[4738]: E0307 07:02:31.922017 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:31Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.926744 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.926802 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.926815 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.926835 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.926852 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:31Z","lastTransitionTime":"2026-03-07T07:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:31 crc kubenswrapper[4738]: E0307 07:02:31.945742 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:31Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.950567 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.950611 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.950622 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.950641 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.950654 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:31Z","lastTransitionTime":"2026-03-07T07:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:31 crc kubenswrapper[4738]: E0307 07:02:31.971297 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:31Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.976878 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.976963 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.976983 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.977015 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:31 crc kubenswrapper[4738]: I0307 07:02:31.977035 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:31Z","lastTransitionTime":"2026-03-07T07:02:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:31 crc kubenswrapper[4738]: E0307 07:02:31.996411 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:31Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:31 crc kubenswrapper[4738]: E0307 07:02:31.996667 4738 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.406292 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1099e7a-8864-4149-980e-c91ac87f5cd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://697506c1286108c2dedc27fda25723fdcf24bb4f78ffc7a57fbf8c7e291f3522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 
07:02:32.429488 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c61088f1-3d39-43ba-bc86-bf8eccb61166\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc4b3c9f8fe720d56b45f2fff2c285864e7854909f89cf011551a7d80815722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 07:00:15.269407 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 07:00:15.272712 1 observer_polling.go:159] Starting file observer\\\\nI0307 07:00:15.274611 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 07:00:15.275744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 07:00:42.727995 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0307 07:00:45.715959 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0403a34c44fc04e213d8734bc09b1339c16fed303ce66be6c0c4d1a17794c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://caf48b03181b9be68af545dd350d90b780fb7693c67532b7849de1b07b3f65b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.452893 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.474411 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.494097 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc 
kubenswrapper[4738]: I0307 07:02:32.529329 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.551335 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.574070 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.596679 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.610076 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.625733 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.640997 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.656108 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.669480 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.680136 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.692514 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.714057 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:19Z\\\",\\\"message\\\":\\\"penshift-multus/multus-54cnw\\\\nI0307 07:02:19.295238 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295245 7079 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-54cnw in node crc\\\\nI0307 07:02:19.295250 7079 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-multus/multus-54cnw after 0 failed attempt(s)\\\\nI0307 07:02:19.295256 7079 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295262 7079 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0307 07:02:19.295270 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0307 07:02:19.295182 7079 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed callin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab
265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.724844 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990
f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:32 crc kubenswrapper[4738]: I0307 07:02:32.736338 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:32Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:33 crc kubenswrapper[4738]: I0307 07:02:33.385631 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:33 crc kubenswrapper[4738]: E0307 07:02:33.385860 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:33 crc kubenswrapper[4738]: I0307 07:02:33.386593 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:33 crc kubenswrapper[4738]: I0307 07:02:33.386675 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:33 crc kubenswrapper[4738]: I0307 07:02:33.387018 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:33 crc kubenswrapper[4738]: E0307 07:02:33.387268 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:33 crc kubenswrapper[4738]: E0307 07:02:33.387442 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:33 crc kubenswrapper[4738]: E0307 07:02:33.387624 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:33 crc kubenswrapper[4738]: E0307 07:02:33.592656 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:02:35 crc kubenswrapper[4738]: I0307 07:02:35.385269 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:35 crc kubenswrapper[4738]: I0307 07:02:35.385355 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:35 crc kubenswrapper[4738]: I0307 07:02:35.385269 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:35 crc kubenswrapper[4738]: E0307 07:02:35.385478 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:35 crc kubenswrapper[4738]: I0307 07:02:35.385596 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:35 crc kubenswrapper[4738]: E0307 07:02:35.385659 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:35 crc kubenswrapper[4738]: E0307 07:02:35.385756 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:35 crc kubenswrapper[4738]: E0307 07:02:35.385856 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:36 crc kubenswrapper[4738]: I0307 07:02:36.386652 4738 scope.go:117] "RemoveContainer" containerID="4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553" Mar 07 07:02:36 crc kubenswrapper[4738]: E0307 07:02:36.386890 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" Mar 07 07:02:37 crc kubenswrapper[4738]: I0307 07:02:37.384562 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:37 crc kubenswrapper[4738]: I0307 07:02:37.384674 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:37 crc kubenswrapper[4738]: I0307 07:02:37.384727 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:37 crc kubenswrapper[4738]: E0307 07:02:37.384702 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:37 crc kubenswrapper[4738]: I0307 07:02:37.384928 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:37 crc kubenswrapper[4738]: E0307 07:02:37.384927 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:37 crc kubenswrapper[4738]: E0307 07:02:37.385145 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:37 crc kubenswrapper[4738]: E0307 07:02:37.385284 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:38 crc kubenswrapper[4738]: E0307 07:02:38.594279 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:02:39 crc kubenswrapper[4738]: I0307 07:02:39.385332 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:39 crc kubenswrapper[4738]: I0307 07:02:39.385346 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:39 crc kubenswrapper[4738]: E0307 07:02:39.385516 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:39 crc kubenswrapper[4738]: I0307 07:02:39.385486 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:39 crc kubenswrapper[4738]: I0307 07:02:39.385370 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:39 crc kubenswrapper[4738]: E0307 07:02:39.385635 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:39 crc kubenswrapper[4738]: E0307 07:02:39.385729 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:39 crc kubenswrapper[4738]: E0307 07:02:39.385918 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.238649 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-54cnw_c0a91659-d53f-4694-82a7-8c66445ab4f5/kube-multus/0.log" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.238712 4738 generic.go:334] "Generic (PLEG): container finished" podID="c0a91659-d53f-4694-82a7-8c66445ab4f5" containerID="a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f" exitCode=1 Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.238751 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-54cnw" event={"ID":"c0a91659-d53f-4694-82a7-8c66445ab4f5","Type":"ContainerDied","Data":"a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f"} Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.239237 4738 scope.go:117] "RemoveContainer" containerID="a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.268689 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.293450 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.311685 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.335311 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.354894 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.384457 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.384929 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:41 crc kubenswrapper[4738]: E0307 07:02:41.385142 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.385273 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.385519 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:41 crc kubenswrapper[4738]: E0307 07:02:41.385613 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.385646 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:41 crc kubenswrapper[4738]: E0307 07:02:41.385703 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:41 crc kubenswrapper[4738]: E0307 07:02:41.385833 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.397722 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\
"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.414865 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.432562 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.448447 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.473097 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:41Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:41Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:41Z\\\",\\\"message\\\":\\\"2026-03-07T07:01:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9\\\\n2026-03-07T07:01:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9 to /host/opt/cni/bin/\\\\n2026-03-07T07:01:56Z [verbose] multus-daemon started\\\\n2026-03-07T07:01:56Z [verbose] Readiness Indicator file check\\\\n2026-03-07T07:02:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.497634 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:19Z\\\",\\\"message\\\":\\\"penshift-multus/multus-54cnw\\\\nI0307 07:02:19.295238 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295245 7079 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-54cnw in node crc\\\\nI0307 07:02:19.295250 7079 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-multus/multus-54cnw after 0 failed attempt(s)\\\\nI0307 07:02:19.295256 7079 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295262 7079 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0307 07:02:19.295270 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0307 07:02:19.295182 7079 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed callin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab
265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.516957 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990
f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.536595 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.556334 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1099e7a-8864-4149-980e-c91ac87f5cd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://697506c1286108c2dedc27fda25723fdcf24bb4f78ffc7a57fbf8c7e291f3522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.579341 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c61088f1-3d39-43ba-bc86-bf8eccb61166\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc4b3c9f8fe720d56b45f2fff2c285864e7854909f89cf011551a7d80815722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 07:00:15.269407 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 07:00:15.272712 1 observer_polling.go:159] Starting file observer\\\\nI0307 07:00:15.274611 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 07:00:15.275744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 07:00:42.727995 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0307 07:00:45.715959 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0403a34c44fc04e213d8734bc09b1339c16fed303ce66be6c0c4d1a17794c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://caf48b03181b9be68af545dd350d90b780fb7693c67532b7849de1b07b3f65b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.605834 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.627258 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:41 crc kubenswrapper[4738]: I0307 07:02:41.645596 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:41Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc 
kubenswrapper[4738]: I0307 07:02:42.245467 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-54cnw_c0a91659-d53f-4694-82a7-8c66445ab4f5/kube-multus/0.log" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.245560 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-54cnw" event={"ID":"c0a91659-d53f-4694-82a7-8c66445ab4f5","Type":"ContainerStarted","Data":"13d0b1b16f53e9e23fe6bbed5082f4f749c4f91f743d1b8400e0a81e8b1944c5"} Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.264429 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.283707 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.299660 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.314269 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d0b1b16f53e9e23fe6bbed5082f4f749c4f91f743d1b8400e0a81e8b1944c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:41Z\\\",\\\"message\\\":\\\"2026-03-07T07:01:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9\\\\n2026-03-07T07:01:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9 to /host/opt/cni/bin/\\\\n2026-03-07T07:01:56Z [verbose] multus-daemon started\\\\n2026-03-07T07:01:56Z [verbose] Readiness Indicator file check\\\\n2026-03-07T07:02:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.341466 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:19Z\\\",\\\"message\\\":\\\"penshift-multus/multus-54cnw\\\\nI0307 07:02:19.295238 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295245 7079 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-54cnw in node crc\\\\nI0307 07:02:19.295250 7079 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-multus/multus-54cnw after 0 failed attempt(s)\\\\nI0307 07:02:19.295256 7079 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295262 7079 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0307 07:02:19.295270 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0307 07:02:19.295182 7079 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed callin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab
265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.357728 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990
f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.359049 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.359109 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.359135 4738 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.359207 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.359238 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:42Z","lastTransitionTime":"2026-03-07T07:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:02:42 crc kubenswrapper[4738]: E0307 07:02:42.373435 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.374758 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.377340 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.377391 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.377411 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.377436 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.377454 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:42Z","lastTransitionTime":"2026-03-07T07:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.385313 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1099e7a-8864-4149-980e-c91ac87f5cd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://697506c1286108c2dedc27fda25723fdcf24bb4f78ffc7a57fbf8c7e291f3522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: E0307 07:02:42.390519 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.394679 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.394764 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.394797 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.394826 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.394856 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:42Z","lastTransitionTime":"2026-03-07T07:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.401721 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c61088f1-3d39-43ba-bc86-bf8eccb61166\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc4b3c9f8fe720d56b45f2fff2c285864e7854909f89cf011551a7d80815722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0307 07:00:15.269407 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 07:00:15.272712 1 observer_polling.go:159] Starting file observer\\\\nI0307 07:00:15.274611 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 07:00:15.275744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 07:00:42.727995 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0307 07:00:45.715959 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0403a34c44fc04e213d8734bc09b1339c16fed303ce66be6c0c4d1a17794c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://caf48b03181b9be68af545dd350d90b780fb7693c67532b7849de1b07b3f65b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: E0307 07:02:42.409947 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.413937 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.413995 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.414010 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.414032 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.414047 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:42Z","lastTransitionTime":"2026-03-07T07:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.422084 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: E0307 07:02:42.428795 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.433286 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.433359 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.433372 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.433388 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.433399 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:42Z","lastTransitionTime":"2026-03-07T07:02:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.438760 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: E0307 07:02:42.446798 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: E0307 07:02:42.446975 4738 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.453667 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc 
kubenswrapper[4738]: I0307 07:02:42.467849 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6
rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.500118 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4
9117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\
\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6de
c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.518721 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3a
bbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.538744 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.554615 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.569919 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.586853 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.600429 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.615974 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1099e7a-8864-4149-980e-c91ac87f5cd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://697506c1286108c2dedc27fda25723fdcf24bb4f78ffc7a57fbf8c7e291f3522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.634829 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c61088f1-3d39-43ba-bc86-bf8eccb61166\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc4b3c9f8fe720d56b45f2fff2c285864e7854909f89cf011551a7d80815722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 07:00:15.269407 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 07:00:15.272712 1 observer_polling.go:159] Starting file observer\\\\nI0307 07:00:15.274611 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 07:00:15.275744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 07:00:42.727995 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0307 07:00:45.715959 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0403a34c44fc04e213d8734bc09b1339c16fed303ce66be6c0c4d1a17794c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://caf48b03181b9be68af545dd350d90b780fb7693c67532b7849de1b07b3f65b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.648347 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.663004 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.681201 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc 
kubenswrapper[4738]: I0307 07:02:42.700319 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc8
8930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.714017 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc0
86a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.739537 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.761651 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.782134 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.797541 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.811001 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.830994 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.851482 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.873039 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.890037 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.916023 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d0b1b16f53e9e23fe6bbed5082f4f749c4f91f743d1b8400e0a81e8b1944c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:41Z\\\",\\\"message\\\":\\\"2026-03-07T07:01:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9\\\\n2026-03-07T07:01:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9 to /host/opt/cni/bin/\\\\n2026-03-07T07:01:56Z [verbose] multus-daemon started\\\\n2026-03-07T07:01:56Z [verbose] Readiness Indicator file check\\\\n2026-03-07T07:02:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:42 crc kubenswrapper[4738]: I0307 07:02:42.951474 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:19Z\\\",\\\"message\\\":\\\"penshift-multus/multus-54cnw\\\\nI0307 07:02:19.295238 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295245 7079 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-54cnw in node crc\\\\nI0307 07:02:19.295250 7079 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-multus/multus-54cnw after 0 failed attempt(s)\\\\nI0307 07:02:19.295256 7079 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295262 7079 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0307 07:02:19.295270 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0307 07:02:19.295182 7079 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed callin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab
265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:42Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:43 crc kubenswrapper[4738]: I0307 07:02:43.237776 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs\") pod \"network-metrics-daemon-qpkbn\" (UID: \"ba7ca967-58f7-4944-81d8-7bb8957707ad\") " pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:43 crc kubenswrapper[4738]: E0307 07:02:43.238024 4738 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:02:43 crc kubenswrapper[4738]: E0307 07:02:43.238139 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs podName:ba7ca967-58f7-4944-81d8-7bb8957707ad nodeName:}" failed. 
No retries permitted until 2026-03-07 07:03:47.238110873 +0000 UTC m=+245.703098234 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs") pod "network-metrics-daemon-qpkbn" (UID: "ba7ca967-58f7-4944-81d8-7bb8957707ad") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 07:02:43 crc kubenswrapper[4738]: I0307 07:02:43.385630 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:43 crc kubenswrapper[4738]: I0307 07:02:43.385688 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:43 crc kubenswrapper[4738]: I0307 07:02:43.385728 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:43 crc kubenswrapper[4738]: I0307 07:02:43.385728 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:43 crc kubenswrapper[4738]: E0307 07:02:43.385844 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:43 crc kubenswrapper[4738]: E0307 07:02:43.385986 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:43 crc kubenswrapper[4738]: E0307 07:02:43.386085 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:43 crc kubenswrapper[4738]: E0307 07:02:43.386302 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:43 crc kubenswrapper[4738]: E0307 07:02:43.596106 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:02:45 crc kubenswrapper[4738]: I0307 07:02:45.384816 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:45 crc kubenswrapper[4738]: I0307 07:02:45.384853 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:45 crc kubenswrapper[4738]: I0307 07:02:45.384977 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:45 crc kubenswrapper[4738]: I0307 07:02:45.384992 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:45 crc kubenswrapper[4738]: E0307 07:02:45.385428 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:45 crc kubenswrapper[4738]: E0307 07:02:45.385659 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:45 crc kubenswrapper[4738]: E0307 07:02:45.386005 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:45 crc kubenswrapper[4738]: E0307 07:02:45.386127 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:47 crc kubenswrapper[4738]: I0307 07:02:47.384912 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:47 crc kubenswrapper[4738]: I0307 07:02:47.384912 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:47 crc kubenswrapper[4738]: I0307 07:02:47.385042 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:47 crc kubenswrapper[4738]: I0307 07:02:47.385122 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:47 crc kubenswrapper[4738]: E0307 07:02:47.385365 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:47 crc kubenswrapper[4738]: E0307 07:02:47.385540 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:47 crc kubenswrapper[4738]: E0307 07:02:47.385649 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:47 crc kubenswrapper[4738]: E0307 07:02:47.385742 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:48 crc kubenswrapper[4738]: E0307 07:02:48.597431 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:02:49 crc kubenswrapper[4738]: I0307 07:02:49.385098 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:49 crc kubenswrapper[4738]: I0307 07:02:49.385208 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:49 crc kubenswrapper[4738]: I0307 07:02:49.385113 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:49 crc kubenswrapper[4738]: I0307 07:02:49.385116 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:49 crc kubenswrapper[4738]: E0307 07:02:49.385371 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:49 crc kubenswrapper[4738]: E0307 07:02:49.385574 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:49 crc kubenswrapper[4738]: E0307 07:02:49.385664 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:49 crc kubenswrapper[4738]: E0307 07:02:49.385770 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:51 crc kubenswrapper[4738]: I0307 07:02:51.385667 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:51 crc kubenswrapper[4738]: I0307 07:02:51.385688 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:51 crc kubenswrapper[4738]: I0307 07:02:51.385796 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:51 crc kubenswrapper[4738]: I0307 07:02:51.385868 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:51 crc kubenswrapper[4738]: E0307 07:02:51.386068 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:51 crc kubenswrapper[4738]: E0307 07:02:51.386271 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:51 crc kubenswrapper[4738]: E0307 07:02:51.386547 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:51 crc kubenswrapper[4738]: E0307 07:02:51.386623 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:51 crc kubenswrapper[4738]: I0307 07:02:51.387869 4738 scope.go:117] "RemoveContainer" containerID="4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.287587 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovnkube-controller/2.log" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.291629 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerStarted","Data":"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e"} Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.292040 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.319513 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.335403 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1099e7a-8864-4149-980e-c91ac87f5cd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://697506c1286108c2dedc27fda25723fdcf24bb4f78ffc7a57fbf8c7e291f3522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.349251 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c61088f1-3d39-43ba-bc86-bf8eccb61166\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc4b3c9f8fe720d56b45f2fff2c285864e7854909f89cf011551a7d80815722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 07:00:15.269407 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 07:00:15.272712 1 observer_polling.go:159] Starting file observer\\\\nI0307 07:00:15.274611 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 07:00:15.275744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 07:00:42.727995 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0307 07:00:45.715959 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0403a34c44fc04e213d8734bc09b1339c16fed303ce66be6c0c4d1a17794c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://caf48b03181b9be68af545dd350d90b780fb7693c67532b7849de1b07b3f65b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.360197 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.369493 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.381285 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc 
kubenswrapper[4738]: I0307 07:02:52.389857 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6
rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.411997 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e4
9117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\
\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6de
c6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.423746 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3a
bbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.434204 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.449695 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.462572 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.481933 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.482893 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.482923 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.482932 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.482948 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.482957 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:52Z","lastTransitionTime":"2026-03-07T07:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.493827 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: E0307 07:02:52.495400 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"kubelet 
has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800
f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\
":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc300
5909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.498828 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.498869 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.498879 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.498895 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.498907 4738 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:52Z","lastTransitionTime":"2026-03-07T07:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.509781 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: E0307 07:02:52.511528 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.521561 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.521610 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.521620 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.521639 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.521651 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:52Z","lastTransitionTime":"2026-03-07T07:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.529775 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: E0307 07:02:52.554680 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.570760 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d0b1b16f53e9e23fe6bbed5082f4f749c4f91f743d1b8400e0a81e8b1944c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:41Z\\\",\\\"message\\\":\\\"2026-03-07T07:01:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9\\\\n2026-03-07T07:01:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9 to /host/opt/cni/bin/\\\\n2026-03-07T07:01:56Z [verbose] multus-daemon started\\\\n2026-03-07T07:01:56Z [verbose] 
Readiness Indicator file check\\\\n2026-03-07T07:02:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.570843 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.570904 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.570924 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.570950 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.570976 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:52Z","lastTransitionTime":"2026-03-07T07:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:52 crc kubenswrapper[4738]: E0307 07:02:52.591861 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.595896 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.595942 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.595951 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.595972 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.595982 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:02:52Z","lastTransitionTime":"2026-03-07T07:02:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.600116 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:19Z\\\",\\\"message\\\":\\\"penshift-multus/multus-54cnw\\\\nI0307 07:02:19.295238 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295245 7079 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-54cnw in node crc\\\\nI0307 07:02:19.295250 7079 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-multus/multus-54cnw after 0 failed attempt(s)\\\\nI0307 07:02:19.295256 7079 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295262 7079 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0307 07:02:19.295270 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0307 07:02:19.295182 7079 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed 
callin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: E0307 07:02:52.608317 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: E0307 07:02:52.608440 4738 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.611754 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.626617 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\
":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"sys
tem-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.637113 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895
a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.659406 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.676302 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.694643 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.710719 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.724990 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.740841 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.756055 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.770440 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.782252 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.799865 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d0b1b16f53e9e23fe6bbed5082f4f749c4f91f743d1b8400e0a81e8b1944c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:41Z\\\",\\\"message\\\":\\\"2026-03-07T07:01:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9\\\\n2026-03-07T07:01:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9 to /host/opt/cni/bin/\\\\n2026-03-07T07:01:56Z [verbose] multus-daemon started\\\\n2026-03-07T07:01:56Z [verbose] Readiness Indicator file check\\\\n2026-03-07T07:02:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.825347 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:19Z\\\",\\\"message\\\":\\\"penshift-multus/multus-54cnw\\\\nI0307 07:02:19.295238 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295245 7079 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-54cnw in node crc\\\\nI0307 07:02:19.295250 7079 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-multus/multus-54cnw after 0 failed attempt(s)\\\\nI0307 07:02:19.295256 7079 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295262 7079 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0307 07:02:19.295270 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0307 07:02:19.295182 7079 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed 
callin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.838706 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.853846 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1099e7a-8864-4149-980e-c91ac87f5cd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://697506c1286108c2dedc27fda25723fdcf24bb4f78ffc7a57fbf8c7e291f3522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.869444 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c61088f1-3d39-43ba-bc86-bf8eccb61166\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc4b3c9f8fe720d56b45f2fff2c285864e7854909f89cf011551a7d80815722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 07:00:15.269407 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 07:00:15.272712 1 observer_polling.go:159] Starting file observer\\\\nI0307 07:00:15.274611 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 07:00:15.275744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 07:00:42.727995 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0307 07:00:45.715959 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0403a34c44fc04e213d8734bc09b1339c16fed303ce66be6c0c4d1a17794c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://caf48b03181b9be68af545dd350d90b780fb7693c67532b7849de1b07b3f65b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.886523 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.900905 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:52 crc kubenswrapper[4738]: I0307 07:02:52.915575 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:52Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc 
kubenswrapper[4738]: I0307 07:02:53.299423 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovnkube-controller/3.log" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.300683 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovnkube-controller/2.log" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.305582 4738 generic.go:334] "Generic (PLEG): container finished" podID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerID="168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e" exitCode=1 Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.305666 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerDied","Data":"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e"} Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.305722 4738 scope.go:117] "RemoveContainer" containerID="4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.307117 4738 scope.go:117] "RemoveContainer" containerID="168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e" Mar 07 07:02:53 crc kubenswrapper[4738]: E0307 07:02:53.307614 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.324003 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1099e7a-8864-4149-980e-c91ac87f5cd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://697506c1286108c2dedc27fda25723fdcf24bb4f78ffc7a57fbf8c7e291f3522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.340662 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c61088f1-3d39-43ba-bc86-bf8eccb61166\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc4b3c9f8fe720d56b45f2fff2c285864e7854909f89cf011551a7d80815722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 07:00:15.269407 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 07:00:15.272712 1 observer_polling.go:159] Starting file observer\\\\nI0307 07:00:15.274611 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 07:00:15.275744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 07:00:42.727995 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0307 07:00:45.715959 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0403a34c44fc04e213d8734bc09b1339c16fed303ce66be6c0c4d1a17794c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://caf48b03181b9be68af545dd350d90b780fb7693c67532b7849de1b07b3f65b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.355536 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.371341 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.385106 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc 
kubenswrapper[4738]: I0307 07:02:53.385339 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.385493 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.385491 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:53 crc kubenswrapper[4738]: E0307 07:02:53.385744 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.385992 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:53 crc kubenswrapper[4738]: E0307 07:02:53.386003 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:53 crc kubenswrapper[4738]: E0307 07:02:53.386056 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:53 crc kubenswrapper[4738]: E0307 07:02:53.386583 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.412367 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c40
4ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.429714 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.455042 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.479326 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.499427 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.520703 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.538380 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.556474 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.577797 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.594881 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: E0307 07:02:53.598338 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.608800 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.628069 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d0b1b16f53e9e23fe6bbed5082f4f749c4f91f743d1b8400e0a81e8b1944c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b81
9eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:41Z\\\",\\\"message\\\":\\\"2026-03-07T07:01:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9\\\\n2026-03-07T07:01:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9 to /host/opt/cni/bin/\\\\n2026-03-07T07:01:56Z [verbose] multus-daemon started\\\\n2026-03-07T07:01:56Z [verbose] Readiness Indicator file check\\\\n2026-03-07T07:02:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.670385 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4ba9fdb4e339acf301faf419c3dd6f712f0e2082c1a2903d11c091fe57274553\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:19Z\\\",\\\"message\\\":\\\"penshift-multus/multus-54cnw\\\\nI0307 07:02:19.295238 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295245 7079 ovn.go:134] Ensuring zone local for Pod openshift-multus/multus-54cnw in node crc\\\\nI0307 07:02:19.295250 7079 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-multus/multus-54cnw after 0 failed attempt(s)\\\\nI0307 07:02:19.295256 7079 default_network_controller.go:776] Recording success event on pod openshift-multus/multus-54cnw\\\\nI0307 07:02:19.295262 7079 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nI0307 07:02:19.295270 7079 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-source-55646444c4-trplf\\\\nF0307 07:02:19.295182 7079 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed callin\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"; gw=[10.217.0.1]\\\\nI0307 07:02:52.332542 7429 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0307 07:02:52.332549 7429 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0307 07:02:52.332331 7429 services_controller.go:443] Built service openshift-network-console/networking-console-plugin LB cluster-wide configs for network=default: 
[]services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.246\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF0307 07:02:52.332563 7429 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal err\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"n
ame\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:53 crc kubenswrapper[4738]: I0307 07:02:53.687510 4738 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd
8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:53Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.312066 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovnkube-controller/3.log" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.318660 4738 scope.go:117] "RemoveContainer" containerID="168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e" Mar 07 07:02:54 crc kubenswrapper[4738]: E0307 07:02:54.318962 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.353538 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.380367 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.403000 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.428435 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.445133 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.470898 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.491057 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.509183 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.525816 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.542625 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.562352 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d0b1b16f53e9e23fe6bbed5082f4f749c4f91f743d1b8400e0a81e8b1944c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:41Z\\\",\\\"message\\\":\\\"2026-03-07T07:01:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9\\\\n2026-03-07T07:01:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9 to /host/opt/cni/bin/\\\\n2026-03-07T07:01:56Z [verbose] multus-daemon started\\\\n2026-03-07T07:01:56Z [verbose] Readiness Indicator file check\\\\n2026-03-07T07:02:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.594024 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"; gw=[10.217.0.1]\\\\nI0307 07:02:52.332542 7429 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0307 07:02:52.332549 7429 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0307 
07:02:52.332331 7429 services_controller.go:443] Built service openshift-network-console/networking-console-plugin LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.246\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF0307 07:02:52.332563 7429 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal err\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab
265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.611624 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990
f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.630958 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.646655 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1099e7a-8864-4149-980e-c91ac87f5cd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://697506c1286108c2dedc27fda25723fdcf24bb4f78ffc7a57fbf8c7e291f3522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.665956 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c61088f1-3d39-43ba-bc86-bf8eccb61166\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc4b3c9f8fe720d56b45f2fff2c285864e7854909f89cf011551a7d80815722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 07:00:15.269407 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 07:00:15.272712 1 observer_polling.go:159] Starting file observer\\\\nI0307 07:00:15.274611 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 07:00:15.275744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 07:00:42.727995 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0307 07:00:45.715959 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0403a34c44fc04e213d8734bc09b1339c16fed303ce66be6c0c4d1a17794c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://caf48b03181b9be68af545dd350d90b780fb7693c67532b7849de1b07b3f65b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.683727 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.701578 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:54 crc kubenswrapper[4738]: I0307 07:02:54.715971 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:02:54Z is after 2025-08-24T17:21:41Z" Mar 07 07:02:55 crc 
kubenswrapper[4738]: I0307 07:02:55.385138 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:55 crc kubenswrapper[4738]: I0307 07:02:55.385223 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:55 crc kubenswrapper[4738]: I0307 07:02:55.385344 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:55 crc kubenswrapper[4738]: I0307 07:02:55.385744 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:55 crc kubenswrapper[4738]: E0307 07:02:55.385721 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:55 crc kubenswrapper[4738]: E0307 07:02:55.385917 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:55 crc kubenswrapper[4738]: E0307 07:02:55.386066 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:55 crc kubenswrapper[4738]: E0307 07:02:55.386337 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:57 crc kubenswrapper[4738]: I0307 07:02:57.384729 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:57 crc kubenswrapper[4738]: I0307 07:02:57.384817 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:57 crc kubenswrapper[4738]: E0307 07:02:57.384921 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:57 crc kubenswrapper[4738]: I0307 07:02:57.384833 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:57 crc kubenswrapper[4738]: E0307 07:02:57.385077 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:57 crc kubenswrapper[4738]: E0307 07:02:57.385252 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:02:57 crc kubenswrapper[4738]: I0307 07:02:57.387144 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:57 crc kubenswrapper[4738]: E0307 07:02:57.387565 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:58 crc kubenswrapper[4738]: E0307 07:02:58.600059 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:02:59 crc kubenswrapper[4738]: I0307 07:02:59.384957 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:02:59 crc kubenswrapper[4738]: E0307 07:02:59.385590 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:02:59 crc kubenswrapper[4738]: I0307 07:02:59.385223 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:02:59 crc kubenswrapper[4738]: I0307 07:02:59.385061 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:02:59 crc kubenswrapper[4738]: E0307 07:02:59.385726 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:02:59 crc kubenswrapper[4738]: I0307 07:02:59.385297 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:02:59 crc kubenswrapper[4738]: E0307 07:02:59.385915 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:02:59 crc kubenswrapper[4738]: E0307 07:02:59.386090 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:01 crc kubenswrapper[4738]: I0307 07:03:01.385581 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:01 crc kubenswrapper[4738]: I0307 07:03:01.385632 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:01 crc kubenswrapper[4738]: I0307 07:03:01.385596 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:01 crc kubenswrapper[4738]: I0307 07:03:01.385700 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:01 crc kubenswrapper[4738]: E0307 07:03:01.385977 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:01 crc kubenswrapper[4738]: E0307 07:03:01.386141 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:01 crc kubenswrapper[4738]: E0307 07:03:01.386477 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:01 crc kubenswrapper[4738]: E0307 07:03:01.386607 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.411755 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.430141 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.445659 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-mbtvs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ed4db713-ac09-4c8e-ab4c-f9031c78d476\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3f001997b9089af2a166194dce59e28b321837dea1778548645254137964ff1a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zllrn\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-mbtvs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.465359 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-54cnw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c0a91659-d53f-4694-82a7-8c66445ab4f5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:02:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13d0b1b16f53e9e23fe6bbed5082f4f749c4f91f743d1b8400e0a81e8b1944c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:41Z\\\",\\\"message\\\":\\\"2026-03-07T07:01:55+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9\\\\n2026-03-07T07:01:55+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_361ae1c7-5a86-4b07-98e7-6cb89a5e3fd9 to /host/opt/cni/bin/\\\\n2026-03-07T07:01:56Z [verbose] multus-daemon started\\\\n2026-03-07T07:01:56Z [verbose] Readiness Indicator file check\\\\n2026-03-07T07:02:41Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:02:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-622rm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-54cnw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.496598 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0e3f9734-9fb5-4b90-9268-888bc377406e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-07T07:02:52Z\\\",\\\"message\\\":\\\"; gw=[10.217.0.1]\\\\nI0307 07:02:52.332542 7429 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-node-identity/network-node-identity-vrzqb after 0 failed attempt(s)\\\\nI0307 07:02:52.332549 7429 default_network_controller.go:776] Recording success event on pod openshift-network-node-identity/network-node-identity-vrzqb\\\\nI0307 
07:02:52.332331 7429 services_controller.go:443] Built service openshift-network-console/networking-console-plugin LB cluster-wide configs for network=default: []services.lbConfig{services.lbConfig{vips:[]string{\\\\\\\"10.217.4.246\\\\\\\"}, protocol:\\\\\\\"TCP\\\\\\\", inport:9443, clusterEndpoints:services.lbEndpoints{Port:0, V4IPs:[]string(nil), V6IPs:[]string(nil)}, nodeEndpoints:map[string]services.lbEndpoints{}, externalTrafficLocal:false, internalTrafficLocal:false, hasNodePort:false}}\\\\nF0307 07:02:52.332563 7429 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal err\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:02:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0ab334407f5fe8d5ab
265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-v46jv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jh7s7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.514453 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0211010e-3f22-4ac1-a9a4-bccb4552a2b6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbb79844e2df1aff8a469d3f165a47232f3d2148fc8adac0f132e5639bf8b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bf8a2f1630b94ac40afc5d4c2938e39b7b990
f5323e72ea991acd48ac9228d97\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zd2z5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-hlnrk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.535112 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"40d74f9e-a54e-4834-a85b-03611252a8e4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e0203bafcccfc61dd0d16c23352854d27c2bc30ba18b2f09347e8bd1e6e79823\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5de91699bd897199f653c44a162402eb800fbc7463af9760b4ae47a245c93b7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09010390ee66950631d32211f2bfe655b213fdf5f9612d4a73aa57430ca968f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cd24856bae44f29b14d5834d10fcdfe2834e76756725636499a167d1e5d69ba8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.552860 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b1099e7a-8864-4149-980e-c91ac87f5cd6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://697506c1286108c2dedc27fda25723fdcf24bb4f78ffc7a57fbf8c7e291f3522\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3e5582169f464f864529ac9b03d80756c3214cdb7bb324b3861060cfbb27390f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.571859 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c61088f1-3d39-43ba-bc86-bf8eccb61166\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ebc4b3c9f8fe720d56b45f2fff2c285864e7854909f89cf011551a7d80815722\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bf7b4f2e4ae3034164d255e1ab80ae4aa10f3043431fdd21afd17c0194fc3b99\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:45Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0307 07:00:15.269407 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0307 07:00:15.272712 1 observer_polling.go:159] Starting file observer\\\\nI0307 07:00:15.274611 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0307 07:00:15.275744 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0307 07:00:42.727995 1 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials\\\\nF0307 07:00:45.715959 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:15Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:00:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a0403a34c44fc04e213d8734bc09b1339c16fed303ce66be6c0c4d1a17794c04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://caf48b03181b9be68af545dd350d90b780fb7693c67532b7849de1b07b3f65b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.586219 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:17Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.600812 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://64108051db360e700a6aae4af979cd05e0f4483e1dafbb40e805d6360313f73e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389
fab7afae68a9fa902670abc3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fwsww\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-t7vcc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.612331 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ba7ca967-58f7-4944-81d8-7bb8957707ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-77lnm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-qpkbn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc 
kubenswrapper[4738]: I0307 07:03:02.634833 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a2f3e3f7-6313-4d09-ac48-1e9fc9bce868\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:00:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8898384ba41189a93635297da969ef6506633bcc58c19a81bcfe0f84e44385e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://a6b3ad1a7a846a0b462c5db72ce653e2b6853c4104ca83d2689877cf9c8f7aee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://53df8d6de07d9f2a42739302ffc2936923650a9c779e94af2571362f309b81d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://987dc55057157fd3dfc14e3684acab643c5e21c881f2355956525a7dc945f64c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f57a1696d83e766b0f2346f2ff2b358c872bb95e402f08ca5859fffb32cfcaca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://941152e103f53fb606002297b4aaac72740a0ff2dfa5bd46ce1d17bbeea6f081\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://966355b73bf5d6ed862598083cfaba335aac96f38890bd36cef322f1dcb3503f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://935d352cc6c3e731674fe92730212bdd2ec5326c9f0d7d9d8d49979049e6dec6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.649529 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b9b5260-782a-4f85-8709-5ba6857b1340\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T06:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-07T07:00:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0307 07:00:52.113781 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0307 07:00:52.114062 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0307 07:00:52.114800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3247027522/tls.crt::/tmp/serving-cert-3247027522/tls.key\\\\\\\"\\\\nI0307 07:00:52.847498 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0307 07:00:52.852931 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0307 07:00:52.852952 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0307 07:00:52.852975 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0307 07:00:52.852982 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0307 07:00:52.859953 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0307 07:00:52.859989 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859993 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0307 07:00:52.859997 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0307 07:00:52.860000 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0307 07:00:52.860003 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0307 07:00:52.860006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0307 07:00:52.860200 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0307 07:00:52.864009 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-07T07:00:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T06:59:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T06:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-07T06:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T06:59:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.668886 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f8c9aa7e57205646d4814a5b86f0ed15303c868afe5f41d54599cff4ddee124\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.690727 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0de3835737b006a0816186f5024e8e40f92618c5131dd821b961af329261ee8a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://45d807b508cef7c5c773309bf9b884bc8dc7eee857734674765fd17c899cebf6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.709574 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:18Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2d92f2a4d2a7c1aa70e711bdc8cc75c16c5a4d18305b3902d7f2555ec2c7188\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.728086 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c725cf7-39ed-4a27-abf6-8e8346cc6ebd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8cd1995a28067e56c08d68e8cc2f97bda5e744eb40a2e8ed03edceea30cda239\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f1aaf8297129c7d97385aeaa8f7aca016923c640a6f23bffcd29b354bd9b2366\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d041222d0fc4c3de882be75b48dc88930108b7694768f937c76ac545bc9930dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9116f6648230e0dd955fb036a28d40b54e5648daeb172a77063e115ddc95ae0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:53Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d9c404ff325f3564cd1aae6f626c54a619289c48b0ac577776caa2ec05c51c29\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:54Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2478aef629e3f665116bffe0dd160a0ce0ae936f15292577c7f2cdfdc380066\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:55Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://96e3fe9b2f0254532b9a8ff3b2b61bc0592c5da36f3b46c50a49049c1a82a66a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-07T07:01:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-07T07:01:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zn2s9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-fmp5z\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.742437 4738 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-6tq9s" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d81bc0bc-b25f-4d1d-b384-2e220823aa3a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-07T07:01:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c01fe7b683f5b6dbae4b610bcbf6fbf68f0f76f50895a74c145469d536342b4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"image
ID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-07T07:01:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6rb7w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-07T07:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-6tq9s\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.795658 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.795726 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.795743 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.795771 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.795790 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:03:02Z","lastTransitionTime":"2026-03-07T07:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:03:02 crc kubenswrapper[4738]: E0307 07:03:02.811960 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.817637 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.817723 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.817745 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.817772 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.817794 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:03:02Z","lastTransitionTime":"2026-03-07T07:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:03:02 crc kubenswrapper[4738]: E0307 07:03:02.834513 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.839600 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.839650 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.839666 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.839691 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.839711 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:03:02Z","lastTransitionTime":"2026-03-07T07:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:03:02 crc kubenswrapper[4738]: E0307 07:03:02.860302 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.866309 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.866362 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.866374 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.866398 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.866412 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:03:02Z","lastTransitionTime":"2026-03-07T07:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:03:02 crc kubenswrapper[4738]: E0307 07:03:02.884393 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.889718 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.889783 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.889806 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.889833 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:03:02 crc kubenswrapper[4738]: I0307 07:03:02.889855 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:03:02Z","lastTransitionTime":"2026-03-07T07:03:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 07 07:03:02 crc kubenswrapper[4738]: E0307 07:03:02.904339 4738 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-07T07:03:02Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"4314a625-f101-41ec-bd1c-f79c10f7f811\\\",\\\"systemUUID\\\":\\\"38a340c1-d2f7-4b78-944f-4f5f0a1624aa\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-07T07:03:02Z is after 2025-08-24T17:21:41Z" Mar 07 07:03:02 crc kubenswrapper[4738]: E0307 07:03:02.904576 4738 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 07:03:03 crc kubenswrapper[4738]: I0307 07:03:03.385453 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:03 crc kubenswrapper[4738]: I0307 07:03:03.385483 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:03 crc kubenswrapper[4738]: I0307 07:03:03.385630 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:03 crc kubenswrapper[4738]: I0307 07:03:03.385646 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:03 crc kubenswrapper[4738]: E0307 07:03:03.385986 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:03 crc kubenswrapper[4738]: E0307 07:03:03.386330 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:03 crc kubenswrapper[4738]: E0307 07:03:03.386523 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:03 crc kubenswrapper[4738]: E0307 07:03:03.386595 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:03 crc kubenswrapper[4738]: E0307 07:03:03.602136 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:03:05 crc kubenswrapper[4738]: I0307 07:03:05.385429 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:05 crc kubenswrapper[4738]: I0307 07:03:05.385501 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:05 crc kubenswrapper[4738]: I0307 07:03:05.385429 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:05 crc kubenswrapper[4738]: E0307 07:03:05.385653 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:05 crc kubenswrapper[4738]: I0307 07:03:05.386023 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:05 crc kubenswrapper[4738]: E0307 07:03:05.386149 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:05 crc kubenswrapper[4738]: E0307 07:03:05.386352 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:05 crc kubenswrapper[4738]: E0307 07:03:05.386459 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:05 crc kubenswrapper[4738]: I0307 07:03:05.386609 4738 scope.go:117] "RemoveContainer" containerID="168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e" Mar 07 07:03:05 crc kubenswrapper[4738]: E0307 07:03:05.387039 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" Mar 07 07:03:07 crc kubenswrapper[4738]: I0307 07:03:07.384648 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:07 crc kubenswrapper[4738]: I0307 07:03:07.384675 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:07 crc kubenswrapper[4738]: I0307 07:03:07.384834 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:07 crc kubenswrapper[4738]: E0307 07:03:07.384862 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:07 crc kubenswrapper[4738]: E0307 07:03:07.385020 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:07 crc kubenswrapper[4738]: E0307 07:03:07.385238 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:07 crc kubenswrapper[4738]: I0307 07:03:07.384675 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:07 crc kubenswrapper[4738]: E0307 07:03:07.385684 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:08 crc kubenswrapper[4738]: E0307 07:03:08.603752 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:03:09 crc kubenswrapper[4738]: I0307 07:03:09.385532 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:09 crc kubenswrapper[4738]: I0307 07:03:09.385634 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:09 crc kubenswrapper[4738]: E0307 07:03:09.385701 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:09 crc kubenswrapper[4738]: I0307 07:03:09.385823 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:09 crc kubenswrapper[4738]: I0307 07:03:09.385819 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:09 crc kubenswrapper[4738]: E0307 07:03:09.385981 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:09 crc kubenswrapper[4738]: E0307 07:03:09.386879 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:09 crc kubenswrapper[4738]: E0307 07:03:09.387106 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:11 crc kubenswrapper[4738]: I0307 07:03:11.385339 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:11 crc kubenswrapper[4738]: I0307 07:03:11.385387 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:11 crc kubenswrapper[4738]: I0307 07:03:11.385398 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:11 crc kubenswrapper[4738]: I0307 07:03:11.385514 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:11 crc kubenswrapper[4738]: E0307 07:03:11.385513 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:11 crc kubenswrapper[4738]: E0307 07:03:11.385680 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:11 crc kubenswrapper[4738]: E0307 07:03:11.385751 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:11 crc kubenswrapper[4738]: E0307 07:03:11.385817 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.484101 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-hlnrk" podStartSLOduration=140.48405517 podStartE2EDuration="2m20.48405517s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:12.462700154 +0000 UTC m=+210.927687485" watchObservedRunningTime="2026-03-07 07:03:12.48405517 +0000 UTC m=+210.949042491" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.518644 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-mbtvs" podStartSLOduration=141.518607635 podStartE2EDuration="2m21.518607635s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:12.517895596 +0000 UTC m=+210.982882937" watchObservedRunningTime="2026-03-07 07:03:12.518607635 +0000 UTC m=+210.983595006" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.568384 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-54cnw" podStartSLOduration=141.568358243 
podStartE2EDuration="2m21.568358243s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:12.546399081 +0000 UTC m=+211.011386412" watchObservedRunningTime="2026-03-07 07:03:12.568358243 +0000 UTC m=+211.033345574" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.582888 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=79.582856567 podStartE2EDuration="1m19.582856567s" podCreationTimestamp="2026-03-07 07:01:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:12.56822322 +0000 UTC m=+211.033210531" watchObservedRunningTime="2026-03-07 07:03:12.582856567 +0000 UTC m=+211.047843888" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.597421 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=57.597407502 podStartE2EDuration="57.597407502s" podCreationTimestamp="2026-03-07 07:02:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:12.596519239 +0000 UTC m=+211.061506560" watchObservedRunningTime="2026-03-07 07:03:12.597407502 +0000 UTC m=+211.062394823" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.634428 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=59.634408243 podStartE2EDuration="59.634408243s" podCreationTimestamp="2026-03-07 07:02:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 
07:03:12.614347921 +0000 UTC m=+211.079335242" watchObservedRunningTime="2026-03-07 07:03:12.634408243 +0000 UTC m=+211.099395564" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.652128 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podStartSLOduration=141.652100392 podStartE2EDuration="2m21.652100392s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:12.65202245 +0000 UTC m=+211.117009771" watchObservedRunningTime="2026-03-07 07:03:12.652100392 +0000 UTC m=+211.117087723" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.702696 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-fmp5z" podStartSLOduration=141.702671421 podStartE2EDuration="2m21.702671421s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:12.702106607 +0000 UTC m=+211.167093928" watchObservedRunningTime="2026-03-07 07:03:12.702671421 +0000 UTC m=+211.167658762" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.749088 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6tq9s" podStartSLOduration=141.74906497 podStartE2EDuration="2m21.74906497s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:12.720335709 +0000 UTC m=+211.185323040" watchObservedRunningTime="2026-03-07 07:03:12.74906497 +0000 UTC m=+211.214052301" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.749563 4738 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=111.749556414 podStartE2EDuration="1m51.749556414s" podCreationTimestamp="2026-03-07 07:01:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:12.746620906 +0000 UTC m=+211.211608247" watchObservedRunningTime="2026-03-07 07:03:12.749556414 +0000 UTC m=+211.214543745" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.766438 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=107.7664141 podStartE2EDuration="1m47.7664141s" podCreationTimestamp="2026-03-07 07:01:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:12.765330901 +0000 UTC m=+211.230318232" watchObservedRunningTime="2026-03-07 07:03:12.7664141 +0000 UTC m=+211.231401431" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.988032 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.988080 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.988091 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.988110 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 07 07:03:12 crc kubenswrapper[4738]: I0307 07:03:12.988122 4738 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-07T07:03:12Z","lastTransitionTime":"2026-03-07T07:03:12Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.049928 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8"] Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.050663 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.054323 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.054411 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.054514 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.055133 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.120715 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58ebfe83-5f4a-4986-899d-6b1a744601a2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vw8h8\" (UID: \"58ebfe83-5f4a-4986-899d-6b1a744601a2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.120793 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/58ebfe83-5f4a-4986-899d-6b1a744601a2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vw8h8\" (UID: \"58ebfe83-5f4a-4986-899d-6b1a744601a2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.120829 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/58ebfe83-5f4a-4986-899d-6b1a744601a2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vw8h8\" (UID: \"58ebfe83-5f4a-4986-899d-6b1a744601a2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.120882 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/58ebfe83-5f4a-4986-899d-6b1a744601a2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vw8h8\" (UID: \"58ebfe83-5f4a-4986-899d-6b1a744601a2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.120924 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/58ebfe83-5f4a-4986-899d-6b1a744601a2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vw8h8\" (UID: \"58ebfe83-5f4a-4986-899d-6b1a744601a2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.221679 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58ebfe83-5f4a-4986-899d-6b1a744601a2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vw8h8\" (UID: \"58ebfe83-5f4a-4986-899d-6b1a744601a2\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.221744 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58ebfe83-5f4a-4986-899d-6b1a744601a2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vw8h8\" (UID: \"58ebfe83-5f4a-4986-899d-6b1a744601a2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.221772 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/58ebfe83-5f4a-4986-899d-6b1a744601a2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vw8h8\" (UID: \"58ebfe83-5f4a-4986-899d-6b1a744601a2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.221810 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/58ebfe83-5f4a-4986-899d-6b1a744601a2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vw8h8\" (UID: \"58ebfe83-5f4a-4986-899d-6b1a744601a2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.221841 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/58ebfe83-5f4a-4986-899d-6b1a744601a2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vw8h8\" (UID: \"58ebfe83-5f4a-4986-899d-6b1a744601a2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.221979 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/58ebfe83-5f4a-4986-899d-6b1a744601a2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-vw8h8\" (UID: \"58ebfe83-5f4a-4986-899d-6b1a744601a2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.222022 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/58ebfe83-5f4a-4986-899d-6b1a744601a2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-vw8h8\" (UID: \"58ebfe83-5f4a-4986-899d-6b1a744601a2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.224281 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/58ebfe83-5f4a-4986-899d-6b1a744601a2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-vw8h8\" (UID: \"58ebfe83-5f4a-4986-899d-6b1a744601a2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.239099 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/58ebfe83-5f4a-4986-899d-6b1a744601a2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-vw8h8\" (UID: \"58ebfe83-5f4a-4986-899d-6b1a744601a2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.259731 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/58ebfe83-5f4a-4986-899d-6b1a744601a2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-vw8h8\" (UID: \"58ebfe83-5f4a-4986-899d-6b1a744601a2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 
07:03:13.380432 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.385039 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:13 crc kubenswrapper[4738]: E0307 07:03:13.385699 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.385196 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.385062 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:13 crc kubenswrapper[4738]: E0307 07:03:13.385777 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.385291 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:13 crc kubenswrapper[4738]: E0307 07:03:13.385856 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:13 crc kubenswrapper[4738]: E0307 07:03:13.385919 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:13 crc kubenswrapper[4738]: E0307 07:03:13.606553 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.821347 4738 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 07 07:03:13 crc kubenswrapper[4738]: I0307 07:03:13.835196 4738 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 07 07:03:14 crc kubenswrapper[4738]: I0307 07:03:14.406471 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" event={"ID":"58ebfe83-5f4a-4986-899d-6b1a744601a2","Type":"ContainerStarted","Data":"022d83fa87e64bf3669b67e6c913f4b14742db618e4d7bc83aeb42138c9b77c7"} Mar 07 07:03:14 crc kubenswrapper[4738]: I0307 07:03:14.406546 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" event={"ID":"58ebfe83-5f4a-4986-899d-6b1a744601a2","Type":"ContainerStarted","Data":"2525cd9ac76bc364dac353de695adf561750ffbd56105ff19d191d13dcb40923"} Mar 07 07:03:14 crc kubenswrapper[4738]: I0307 07:03:14.426777 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-vw8h8" podStartSLOduration=143.426742395 podStartE2EDuration="2m23.426742395s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:14.426108549 +0000 UTC m=+212.891095970" watchObservedRunningTime="2026-03-07 07:03:14.426742395 +0000 UTC m=+212.891729756" Mar 07 07:03:15 crc kubenswrapper[4738]: I0307 07:03:15.386359 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:15 crc kubenswrapper[4738]: I0307 07:03:15.386866 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:15 crc kubenswrapper[4738]: E0307 07:03:15.387001 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:15 crc kubenswrapper[4738]: I0307 07:03:15.387074 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:15 crc kubenswrapper[4738]: E0307 07:03:15.387223 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:15 crc kubenswrapper[4738]: E0307 07:03:15.386860 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:15 crc kubenswrapper[4738]: I0307 07:03:15.387332 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:15 crc kubenswrapper[4738]: E0307 07:03:15.387422 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:17 crc kubenswrapper[4738]: I0307 07:03:17.384935 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:17 crc kubenswrapper[4738]: I0307 07:03:17.384977 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:17 crc kubenswrapper[4738]: I0307 07:03:17.385023 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:17 crc kubenswrapper[4738]: E0307 07:03:17.385112 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:17 crc kubenswrapper[4738]: I0307 07:03:17.385133 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:17 crc kubenswrapper[4738]: E0307 07:03:17.385271 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:17 crc kubenswrapper[4738]: E0307 07:03:17.385437 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:17 crc kubenswrapper[4738]: E0307 07:03:17.385550 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:18 crc kubenswrapper[4738]: E0307 07:03:18.608137 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:03:19 crc kubenswrapper[4738]: I0307 07:03:19.385440 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:19 crc kubenswrapper[4738]: E0307 07:03:19.385766 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:19 crc kubenswrapper[4738]: I0307 07:03:19.385813 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:19 crc kubenswrapper[4738]: I0307 07:03:19.385907 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:19 crc kubenswrapper[4738]: I0307 07:03:19.385914 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:19 crc kubenswrapper[4738]: E0307 07:03:19.386271 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:19 crc kubenswrapper[4738]: E0307 07:03:19.386492 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:19 crc kubenswrapper[4738]: E0307 07:03:19.386624 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:20 crc kubenswrapper[4738]: I0307 07:03:20.387102 4738 scope.go:117] "RemoveContainer" containerID="168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e" Mar 07 07:03:20 crc kubenswrapper[4738]: E0307 07:03:20.387434 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jh7s7_openshift-ovn-kubernetes(0e3f9734-9fb5-4b90-9268-888bc377406e)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" Mar 07 07:03:21 crc kubenswrapper[4738]: I0307 07:03:21.385499 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:21 crc kubenswrapper[4738]: I0307 07:03:21.385622 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:21 crc kubenswrapper[4738]: I0307 07:03:21.385630 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:21 crc kubenswrapper[4738]: E0307 07:03:21.385779 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:21 crc kubenswrapper[4738]: I0307 07:03:21.386342 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:21 crc kubenswrapper[4738]: E0307 07:03:21.386415 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:21 crc kubenswrapper[4738]: E0307 07:03:21.386598 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:21 crc kubenswrapper[4738]: E0307 07:03:21.386666 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:23 crc kubenswrapper[4738]: I0307 07:03:23.384686 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:23 crc kubenswrapper[4738]: E0307 07:03:23.384950 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:23 crc kubenswrapper[4738]: I0307 07:03:23.384729 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:23 crc kubenswrapper[4738]: E0307 07:03:23.385108 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:23 crc kubenswrapper[4738]: I0307 07:03:23.384734 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:23 crc kubenswrapper[4738]: E0307 07:03:23.385263 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:23 crc kubenswrapper[4738]: I0307 07:03:23.384700 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:23 crc kubenswrapper[4738]: E0307 07:03:23.385371 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:23 crc kubenswrapper[4738]: E0307 07:03:23.610129 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:03:25 crc kubenswrapper[4738]: I0307 07:03:25.385416 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:25 crc kubenswrapper[4738]: I0307 07:03:25.385601 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:25 crc kubenswrapper[4738]: I0307 07:03:25.385773 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:25 crc kubenswrapper[4738]: I0307 07:03:25.385973 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.386131 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.386298 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.386403 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.386547 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:25 crc kubenswrapper[4738]: I0307 07:03:25.583857 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.584031 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:05:27.583996305 +0000 UTC m=+346.048983656 (durationBeforeRetry 2m2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:25 crc kubenswrapper[4738]: I0307 07:03:25.584122 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:25 crc kubenswrapper[4738]: I0307 07:03:25.584218 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.584316 4738 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.584396 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:05:27.584377115 +0000 UTC m=+346.049364466 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.584407 4738 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.584463 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:05:27.584446357 +0000 UTC m=+346.049433708 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 07 07:03:25 crc kubenswrapper[4738]: I0307 07:03:25.685399 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:25 crc kubenswrapper[4738]: I0307 07:03:25.685506 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.685723 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.685726 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.685755 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.685775 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.685781 4738 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.685798 4738 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:03:25 crc 
kubenswrapper[4738]: E0307 07:03:25.685878 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 07:05:27.685853273 +0000 UTC m=+346.150840634 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:03:25 crc kubenswrapper[4738]: E0307 07:03:25.685904 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 07:05:27.685893014 +0000 UTC m=+346.150880375 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 07:03:27 crc kubenswrapper[4738]: I0307 07:03:27.385605 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:27 crc kubenswrapper[4738]: I0307 07:03:27.385650 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:27 crc kubenswrapper[4738]: I0307 07:03:27.385719 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:27 crc kubenswrapper[4738]: I0307 07:03:27.385750 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:27 crc kubenswrapper[4738]: E0307 07:03:27.387190 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:27 crc kubenswrapper[4738]: E0307 07:03:27.387264 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:27 crc kubenswrapper[4738]: E0307 07:03:27.387336 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:27 crc kubenswrapper[4738]: E0307 07:03:27.387395 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:27 crc kubenswrapper[4738]: I0307 07:03:27.461258 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-54cnw_c0a91659-d53f-4694-82a7-8c66445ab4f5/kube-multus/1.log" Mar 07 07:03:27 crc kubenswrapper[4738]: I0307 07:03:27.461876 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-54cnw_c0a91659-d53f-4694-82a7-8c66445ab4f5/kube-multus/0.log" Mar 07 07:03:27 crc kubenswrapper[4738]: I0307 07:03:27.461954 4738 generic.go:334] "Generic (PLEG): container finished" podID="c0a91659-d53f-4694-82a7-8c66445ab4f5" containerID="13d0b1b16f53e9e23fe6bbed5082f4f749c4f91f743d1b8400e0a81e8b1944c5" exitCode=1 Mar 07 07:03:27 crc kubenswrapper[4738]: I0307 07:03:27.462001 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-54cnw" event={"ID":"c0a91659-d53f-4694-82a7-8c66445ab4f5","Type":"ContainerDied","Data":"13d0b1b16f53e9e23fe6bbed5082f4f749c4f91f743d1b8400e0a81e8b1944c5"} Mar 07 07:03:27 crc kubenswrapper[4738]: I0307 07:03:27.462055 4738 scope.go:117] "RemoveContainer" containerID="a0883249a47d3d7ab400cc33c315ce86af602a642a06ceabbde65fc5d266e62f" Mar 07 07:03:27 crc kubenswrapper[4738]: I0307 07:03:27.462833 4738 scope.go:117] "RemoveContainer" containerID="13d0b1b16f53e9e23fe6bbed5082f4f749c4f91f743d1b8400e0a81e8b1944c5" Mar 07 07:03:27 crc kubenswrapper[4738]: E0307 07:03:27.463134 4738 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-54cnw_openshift-multus(c0a91659-d53f-4694-82a7-8c66445ab4f5)\"" pod="openshift-multus/multus-54cnw" podUID="c0a91659-d53f-4694-82a7-8c66445ab4f5" Mar 07 07:03:28 crc kubenswrapper[4738]: I0307 07:03:28.467250 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-54cnw_c0a91659-d53f-4694-82a7-8c66445ab4f5/kube-multus/1.log" Mar 07 07:03:28 crc kubenswrapper[4738]: E0307 07:03:28.612251 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:03:29 crc kubenswrapper[4738]: I0307 07:03:29.385275 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:29 crc kubenswrapper[4738]: I0307 07:03:29.385328 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:29 crc kubenswrapper[4738]: I0307 07:03:29.385337 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:29 crc kubenswrapper[4738]: E0307 07:03:29.385467 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:29 crc kubenswrapper[4738]: E0307 07:03:29.385671 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:29 crc kubenswrapper[4738]: E0307 07:03:29.385892 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:29 crc kubenswrapper[4738]: I0307 07:03:29.385965 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:29 crc kubenswrapper[4738]: E0307 07:03:29.386329 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:31 crc kubenswrapper[4738]: I0307 07:03:31.385719 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:31 crc kubenswrapper[4738]: I0307 07:03:31.385717 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:31 crc kubenswrapper[4738]: I0307 07:03:31.385730 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:31 crc kubenswrapper[4738]: I0307 07:03:31.385981 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:31 crc kubenswrapper[4738]: E0307 07:03:31.386121 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:31 crc kubenswrapper[4738]: E0307 07:03:31.386401 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:31 crc kubenswrapper[4738]: E0307 07:03:31.386533 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:31 crc kubenswrapper[4738]: E0307 07:03:31.386806 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:33 crc kubenswrapper[4738]: I0307 07:03:33.384603 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:33 crc kubenswrapper[4738]: I0307 07:03:33.384701 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:33 crc kubenswrapper[4738]: I0307 07:03:33.384603 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:33 crc kubenswrapper[4738]: E0307 07:03:33.384757 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:33 crc kubenswrapper[4738]: I0307 07:03:33.384628 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:33 crc kubenswrapper[4738]: E0307 07:03:33.384906 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:33 crc kubenswrapper[4738]: E0307 07:03:33.385120 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:33 crc kubenswrapper[4738]: E0307 07:03:33.385149 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:33 crc kubenswrapper[4738]: E0307 07:03:33.614918 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 07 07:03:34 crc kubenswrapper[4738]: I0307 07:03:34.387369 4738 scope.go:117] "RemoveContainer" containerID="168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e" Mar 07 07:03:35 crc kubenswrapper[4738]: I0307 07:03:35.274772 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qpkbn"] Mar 07 07:03:35 crc kubenswrapper[4738]: I0307 07:03:35.274975 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:35 crc kubenswrapper[4738]: E0307 07:03:35.275099 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:35 crc kubenswrapper[4738]: I0307 07:03:35.385350 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:35 crc kubenswrapper[4738]: I0307 07:03:35.385405 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:35 crc kubenswrapper[4738]: I0307 07:03:35.385467 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:35 crc kubenswrapper[4738]: E0307 07:03:35.385517 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:35 crc kubenswrapper[4738]: E0307 07:03:35.385694 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:35 crc kubenswrapper[4738]: E0307 07:03:35.385763 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:35 crc kubenswrapper[4738]: I0307 07:03:35.497731 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovnkube-controller/3.log" Mar 07 07:03:35 crc kubenswrapper[4738]: I0307 07:03:35.501717 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerStarted","Data":"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51"} Mar 07 07:03:35 crc kubenswrapper[4738]: I0307 07:03:35.502388 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:03:37 crc kubenswrapper[4738]: I0307 07:03:37.385017 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:37 crc kubenswrapper[4738]: I0307 07:03:37.385017 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:37 crc kubenswrapper[4738]: E0307 07:03:37.385205 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:37 crc kubenswrapper[4738]: I0307 07:03:37.385042 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:37 crc kubenswrapper[4738]: I0307 07:03:37.385017 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:37 crc kubenswrapper[4738]: E0307 07:03:37.385386 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:37 crc kubenswrapper[4738]: E0307 07:03:37.385469 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:37 crc kubenswrapper[4738]: E0307 07:03:37.385538 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:38 crc kubenswrapper[4738]: I0307 07:03:38.386759 4738 scope.go:117] "RemoveContainer" containerID="13d0b1b16f53e9e23fe6bbed5082f4f749c4f91f743d1b8400e0a81e8b1944c5" Mar 07 07:03:38 crc kubenswrapper[4738]: I0307 07:03:38.414699 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podStartSLOduration=166.414660726 podStartE2EDuration="2m46.414660726s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:35.549287326 +0000 UTC m=+234.014274647" watchObservedRunningTime="2026-03-07 07:03:38.414660726 +0000 UTC m=+236.879648087" Mar 07 07:03:38 crc kubenswrapper[4738]: E0307 07:03:38.616361 4738 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 07 07:03:39 crc kubenswrapper[4738]: I0307 07:03:39.385239 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:39 crc kubenswrapper[4738]: I0307 07:03:39.385299 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:39 crc kubenswrapper[4738]: I0307 07:03:39.385255 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:39 crc kubenswrapper[4738]: I0307 07:03:39.385255 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:39 crc kubenswrapper[4738]: E0307 07:03:39.385449 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:39 crc kubenswrapper[4738]: E0307 07:03:39.385633 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:39 crc kubenswrapper[4738]: E0307 07:03:39.385825 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:39 crc kubenswrapper[4738]: E0307 07:03:39.385731 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:39 crc kubenswrapper[4738]: I0307 07:03:39.523006 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-54cnw_c0a91659-d53f-4694-82a7-8c66445ab4f5/kube-multus/1.log" Mar 07 07:03:39 crc kubenswrapper[4738]: I0307 07:03:39.523092 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-54cnw" event={"ID":"c0a91659-d53f-4694-82a7-8c66445ab4f5","Type":"ContainerStarted","Data":"3ee44a78672d8a355b0dbe707d77bfb8475b2f769e671f6359e5720aaae3c345"} Mar 07 07:03:41 crc kubenswrapper[4738]: I0307 07:03:41.385363 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:41 crc kubenswrapper[4738]: E0307 07:03:41.385597 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:41 crc kubenswrapper[4738]: I0307 07:03:41.385915 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:41 crc kubenswrapper[4738]: E0307 07:03:41.386052 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:41 crc kubenswrapper[4738]: I0307 07:03:41.386327 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:41 crc kubenswrapper[4738]: E0307 07:03:41.386442 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:41 crc kubenswrapper[4738]: I0307 07:03:41.386539 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:41 crc kubenswrapper[4738]: E0307 07:03:41.386842 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.385600 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.385617 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:43 crc kubenswrapper[4738]: E0307 07:03:43.386152 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-qpkbn" podUID="ba7ca967-58f7-4944-81d8-7bb8957707ad" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.385771 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:43 crc kubenswrapper[4738]: E0307 07:03:43.386360 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.385687 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:43 crc kubenswrapper[4738]: E0307 07:03:43.386536 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:03:43 crc kubenswrapper[4738]: E0307 07:03:43.386698 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.904370 4738 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.954910 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx"] Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.957836 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.961314 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.962473 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.962670 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.965707 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.965964 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.966591 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.971312 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-rvllr"] Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.972129 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.981634 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.981872 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.981952 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.982096 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.982221 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.982872 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.983305 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.986133 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.987356 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-w2cw7"] Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.987994 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-w2cw7" Mar 07 07:03:43 crc kubenswrapper[4738]: I0307 07:03:43.999154 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.004032 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.004549 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.005079 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.006241 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.006409 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.007040 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.007975 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.008183 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.008702 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.008843 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-serving-cert\") pod \"route-controller-manager-6576b87f9c-5j9bx\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.009595 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.013983 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdqgb\" (UniqueName: \"kubernetes.io/projected/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-kube-api-access-jdqgb\") pod \"route-controller-manager-6576b87f9c-5j9bx\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.024573 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-config\") pod \"route-controller-manager-6576b87f9c-5j9bx\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.024651 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-client-ca\") pod \"route-controller-manager-6576b87f9c-5j9bx\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.011596 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.018336 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wjdr5"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.011776 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.011854 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.025921 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.025815 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4grdb"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.012266 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.012327 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.014873 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.017877 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.018000 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.018025 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.018130 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.018171 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.018201 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.018263 4738 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.018313 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.049982 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.050118 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.050348 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.051025 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.051248 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.051381 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rb6b7"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.051760 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.051969 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dflfd"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.052318 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4nw58"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.052742 
4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.053023 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k66nh"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.053469 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8xdj4"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.053756 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.059232 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.059685 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4grdb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.059996 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.062389 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7vtnd"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.062822 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xd7zm"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.062824 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.062989 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.063085 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.063475 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4nw58" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.063558 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.063601 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.069895 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-wwk6z"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.070251 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-wwk6z" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.070714 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.084074 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.084317 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.084476 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.084668 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.084771 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.084878 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.085327 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.085477 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.086365 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.086710 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f9m4m"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.087114 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f9m4m" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.087476 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-k9g6h"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.088018 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-k9g6h" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.095423 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.097998 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.098010 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.098154 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.098435 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.098556 4738 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.098698 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.098817 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.099358 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.099770 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.099872 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.099779 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.099810 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.099842 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.101899 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.102057 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 07 
07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.105196 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.102090 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.102128 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.104208 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.104261 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.104318 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.104450 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.104847 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.107113 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.107332 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.107635 4738 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.107917 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.108200 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.108487 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.108662 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.108701 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.108821 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109087 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109104 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109313 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109338 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109414 4738 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109448 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109497 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109571 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109581 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109657 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109700 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109743 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109819 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109860 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109824 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 
07:03:44.109937 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.109996 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.110009 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.112513 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.114474 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165075 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a49d6b8-36d5-48de-90fc-1fcd8f172eab-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mfdxf\" (UID: \"6a49d6b8-36d5-48de-90fc-1fcd8f172eab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165127 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-etcd-ca\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165154 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/e863c889-47c2-459d-a84f-dc360fe3098f-console-oauth-config\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165195 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62300cba-ef24-4937-90da-6e210a59bc66-audit-dir\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165215 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v75cf\" (UniqueName: \"kubernetes.io/projected/8dbe9f44-a77f-4664-8076-edfeba6967ec-kube-api-access-v75cf\") pod \"authentication-operator-69f744f599-dflfd\" (UID: \"8dbe9f44-a77f-4664-8076-edfeba6967ec\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165242 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a3d09cb-0a23-48ff-9e8c-83ca39d2ff63-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4grdb\" (UID: \"6a3d09cb-0a23-48ff-9e8c-83ca39d2ff63\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4grdb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165261 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxvm5\" (UniqueName: \"kubernetes.io/projected/6a3d09cb-0a23-48ff-9e8c-83ca39d2ff63-kube-api-access-gxvm5\") pod \"cluster-samples-operator-665b6dd947-4grdb\" (UID: \"6a3d09cb-0a23-48ff-9e8c-83ca39d2ff63\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4grdb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165284 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a89ff68b-6456-4756-b991-5ab40a3f3dfe-node-pullsecrets\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165308 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a89ff68b-6456-4756-b991-5ab40a3f3dfe-config\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165334 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/469adfed-daf6-48e4-9c59-c6823e3a4143-config\") pod \"console-operator-58897d9998-w2cw7\" (UID: \"469adfed-daf6-48e4-9c59-c6823e3a4143\") " pod="openshift-console-operator/console-operator-58897d9998-w2cw7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165356 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pnhz\" (UniqueName: \"kubernetes.io/projected/de384384-33aa-408f-a998-bf6b3e79e0a8-kube-api-access-7pnhz\") pod \"machine-api-operator-5694c8668f-wjdr5\" (UID: \"de384384-33aa-408f-a998-bf6b3e79e0a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165373 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/a89ff68b-6456-4756-b991-5ab40a3f3dfe-image-import-ca\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165391 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a89ff68b-6456-4756-b991-5ab40a3f3dfe-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165411 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dbe9f44-a77f-4664-8076-edfeba6967ec-config\") pod \"authentication-operator-69f744f599-dflfd\" (UID: \"8dbe9f44-a77f-4664-8076-edfeba6967ec\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165435 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2326371b-8e91-4a56-b615-5b98c7f93b79-config\") pod \"kube-apiserver-operator-766d6c64bb-qtvn2\" (UID: \"2326371b-8e91-4a56-b615-5b98c7f93b79\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165458 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr9pn\" (UniqueName: \"kubernetes.io/projected/6a49d6b8-36d5-48de-90fc-1fcd8f172eab-kube-api-access-pr9pn\") pod \"openshift-apiserver-operator-796bbdcf4f-mfdxf\" (UID: \"6a49d6b8-36d5-48de-90fc-1fcd8f172eab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf" Mar 
07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165481 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e863c889-47c2-459d-a84f-dc360fe3098f-console-serving-cert\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165506 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a89ff68b-6456-4756-b991-5ab40a3f3dfe-serving-cert\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165535 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a89ff68b-6456-4756-b991-5ab40a3f3dfe-audit-dir\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165559 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92b2l\" (UniqueName: \"kubernetes.io/projected/7ac92454-dfca-4c2f-aa74-a1226bfd9f66-kube-api-access-92b2l\") pod \"dns-operator-744455d44c-4nw58\" (UID: \"7ac92454-dfca-4c2f-aa74-a1226bfd9f66\") " pod="openshift-dns-operator/dns-operator-744455d44c-4nw58" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165590 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/62300cba-ef24-4937-90da-6e210a59bc66-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: 
\"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165616 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqxf5\" (UniqueName: \"kubernetes.io/projected/62300cba-ef24-4937-90da-6e210a59bc66-kube-api-access-wqxf5\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165639 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a89ff68b-6456-4756-b991-5ab40a3f3dfe-audit\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165662 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e863c889-47c2-459d-a84f-dc360fe3098f-service-ca\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165687 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/469adfed-daf6-48e4-9c59-c6823e3a4143-trusted-ca\") pod \"console-operator-58897d9998-w2cw7\" (UID: \"469adfed-daf6-48e4-9c59-c6823e3a4143\") " pod="openshift-console-operator/console-operator-58897d9998-w2cw7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165714 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdntv\" (UniqueName: 
\"kubernetes.io/projected/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-kube-api-access-rdntv\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165742 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-config\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165766 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9b53e2fb-773e-4c0e-a251-c0fdc80b66f8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k66nh\" (UID: \"9b53e2fb-773e-4c0e-a251-c0fdc80b66f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165784 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2326371b-8e91-4a56-b615-5b98c7f93b79-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qtvn2\" (UID: \"2326371b-8e91-4a56-b615-5b98c7f93b79\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165842 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-serving-cert\") pod \"route-controller-manager-6576b87f9c-5j9bx\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 
07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165868 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a49d6b8-36d5-48de-90fc-1fcd8f172eab-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mfdxf\" (UID: \"6a49d6b8-36d5-48de-90fc-1fcd8f172eab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165897 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62300cba-ef24-4937-90da-6e210a59bc66-audit-policies\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165920 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de384384-33aa-408f-a998-bf6b3e79e0a8-config\") pod \"machine-api-operator-5694c8668f-wjdr5\" (UID: \"de384384-33aa-408f-a998-bf6b3e79e0a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165941 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62300cba-ef24-4937-90da-6e210a59bc66-serving-cert\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165963 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/62300cba-ef24-4937-90da-6e210a59bc66-encryption-config\") pod 
\"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165984 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e863c889-47c2-459d-a84f-dc360fe3098f-trusted-ca-bundle\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166008 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/469adfed-daf6-48e4-9c59-c6823e3a4143-serving-cert\") pod \"console-operator-58897d9998-w2cw7\" (UID: \"469adfed-daf6-48e4-9c59-c6823e3a4143\") " pod="openshift-console-operator/console-operator-58897d9998-w2cw7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166029 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dbe9f44-a77f-4664-8076-edfeba6967ec-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dflfd\" (UID: \"8dbe9f44-a77f-4664-8076-edfeba6967ec\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166049 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-serving-cert\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166080 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b53e2fb-773e-4c0e-a251-c0fdc80b66f8-serving-cert\") pod \"openshift-config-operator-7777fb866f-k66nh\" (UID: \"9b53e2fb-773e-4c0e-a251-c0fdc80b66f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166103 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4t5l\" (UniqueName: \"kubernetes.io/projected/469adfed-daf6-48e4-9c59-c6823e3a4143-kube-api-access-p4t5l\") pod \"console-operator-58897d9998-w2cw7\" (UID: \"469adfed-daf6-48e4-9c59-c6823e3a4143\") " pod="openshift-console-operator/console-operator-58897d9998-w2cw7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166125 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-etcd-service-ca\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166144 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a89ff68b-6456-4756-b991-5ab40a3f3dfe-etcd-serving-ca\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166178 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a89ff68b-6456-4756-b991-5ab40a3f3dfe-encryption-config\") pod \"apiserver-76f77b778f-rb6b7\" (UID: 
\"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166198 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dbe9f44-a77f-4664-8076-edfeba6967ec-service-ca-bundle\") pod \"authentication-operator-69f744f599-dflfd\" (UID: \"8dbe9f44-a77f-4664-8076-edfeba6967ec\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166221 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/de384384-33aa-408f-a998-bf6b3e79e0a8-images\") pod \"machine-api-operator-5694c8668f-wjdr5\" (UID: \"de384384-33aa-408f-a998-bf6b3e79e0a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166239 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-etcd-client\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166267 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a89ff68b-6456-4756-b991-5ab40a3f3dfe-etcd-client\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166287 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/de384384-33aa-408f-a998-bf6b3e79e0a8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wjdr5\" (UID: \"de384384-33aa-408f-a998-bf6b3e79e0a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166303 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2326371b-8e91-4a56-b615-5b98c7f93b79-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qtvn2\" (UID: \"2326371b-8e91-4a56-b615-5b98c7f93b79\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166323 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mn47\" (UniqueName: \"kubernetes.io/projected/9b53e2fb-773e-4c0e-a251-c0fdc80b66f8-kube-api-access-6mn47\") pod \"openshift-config-operator-7777fb866f-k66nh\" (UID: \"9b53e2fb-773e-4c0e-a251-c0fdc80b66f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166345 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/62300cba-ef24-4937-90da-6e210a59bc66-etcd-client\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166378 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxlb6\" (UniqueName: \"kubernetes.io/projected/a89ff68b-6456-4756-b991-5ab40a3f3dfe-kube-api-access-cxlb6\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " 
pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166396 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ac92454-dfca-4c2f-aa74-a1226bfd9f66-metrics-tls\") pod \"dns-operator-744455d44c-4nw58\" (UID: \"7ac92454-dfca-4c2f-aa74-a1226bfd9f66\") " pod="openshift-dns-operator/dns-operator-744455d44c-4nw58" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166412 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e863c889-47c2-459d-a84f-dc360fe3098f-console-config\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166434 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdqgb\" (UniqueName: \"kubernetes.io/projected/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-kube-api-access-jdqgb\") pod \"route-controller-manager-6576b87f9c-5j9bx\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166456 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62300cba-ef24-4937-90da-6e210a59bc66-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166489 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-config\") pod 
\"route-controller-manager-6576b87f9c-5j9bx\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166517 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-client-ca\") pod \"route-controller-manager-6576b87f9c-5j9bx\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166540 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8dbe9f44-a77f-4664-8076-edfeba6967ec-serving-cert\") pod \"authentication-operator-69f744f599-dflfd\" (UID: \"8dbe9f44-a77f-4664-8076-edfeba6967ec\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166563 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e863c889-47c2-459d-a84f-dc360fe3098f-oauth-serving-cert\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.166584 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sccf8\" (UniqueName: \"kubernetes.io/projected/e863c889-47c2-459d-a84f-dc360fe3098f-kube-api-access-sccf8\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.165656 4738 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.175982 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.175991 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gsfcg"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.176916 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.177115 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-client-ca\") pod \"route-controller-manager-6576b87f9c-5j9bx\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.177204 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.177398 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.178881 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-config\") pod \"route-controller-manager-6576b87f9c-5j9bx\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.179033 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.181037 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.181806 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.182181 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.182696 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.183127 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.183719 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.185370 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wj9wt"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.186037 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vm4bq"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.186147 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wj9wt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.186572 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vm4bq" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.186792 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-serving-cert\") pod \"route-controller-manager-6576b87f9c-5j9bx\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.186605 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.187631 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qx5sp"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.188238 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qx5sp" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.189740 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.190379 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.190910 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-67thp"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.191400 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-67thp" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.191590 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.192422 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.194980 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.201454 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.201948 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.207136 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.207963 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.209989 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.210706 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.211982 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.216439 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kblzd"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.216632 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.217219 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547782-mrbc5"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.217364 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kblzd" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.217891 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547782-mrbc5" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.219705 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.220805 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.222526 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.223923 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.224734 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.226755 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.227782 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.229875 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4grdb"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.230078 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.231571 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.233066 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rvllr"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.234937 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rwwsd"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.235474 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rwwsd" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.238368 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8xdj4"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.239552 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.240429 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.241635 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.242735 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.243928 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rb6b7"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.245217 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-67thp"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.245889 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.247050 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wj9wt"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.250761 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-w2cw7"] Mar 07 07:03:44 crc 
kubenswrapper[4738]: I0307 07:03:44.254400 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vm4bq"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.261556 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.269733 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/de384384-33aa-408f-a998-bf6b3e79e0a8-images\") pod \"machine-api-operator-5694c8668f-wjdr5\" (UID: \"de384384-33aa-408f-a998-bf6b3e79e0a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.270861 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/de384384-33aa-408f-a998-bf6b3e79e0a8-images\") pod \"machine-api-operator-5694c8668f-wjdr5\" (UID: \"de384384-33aa-408f-a998-bf6b3e79e0a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.270975 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-etcd-client\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271005 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkqzj\" (UniqueName: \"kubernetes.io/projected/f752fd6d-5074-4f1f-ae2e-f1c4225536f6-kube-api-access-tkqzj\") pod \"router-default-5444994796-k9g6h\" (UID: \"f752fd6d-5074-4f1f-ae2e-f1c4225536f6\") " pod="openshift-ingress/router-default-5444994796-k9g6h" Mar 07 
07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271029 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a89ff68b-6456-4756-b991-5ab40a3f3dfe-etcd-client\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271048 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxlb6\" (UniqueName: \"kubernetes.io/projected/a89ff68b-6456-4756-b991-5ab40a3f3dfe-kube-api-access-cxlb6\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271069 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/de384384-33aa-408f-a998-bf6b3e79e0a8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wjdr5\" (UID: \"de384384-33aa-408f-a998-bf6b3e79e0a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271089 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2326371b-8e91-4a56-b615-5b98c7f93b79-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qtvn2\" (UID: \"2326371b-8e91-4a56-b615-5b98c7f93b79\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271107 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mn47\" (UniqueName: \"kubernetes.io/projected/9b53e2fb-773e-4c0e-a251-c0fdc80b66f8-kube-api-access-6mn47\") pod \"openshift-config-operator-7777fb866f-k66nh\" (UID: 
\"9b53e2fb-773e-4c0e-a251-c0fdc80b66f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271124 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/62300cba-ef24-4937-90da-6e210a59bc66-etcd-client\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271710 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e863c889-47c2-459d-a84f-dc360fe3098f-console-config\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271768 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ac92454-dfca-4c2f-aa74-a1226bfd9f66-metrics-tls\") pod \"dns-operator-744455d44c-4nw58\" (UID: \"7ac92454-dfca-4c2f-aa74-a1226bfd9f66\") " pod="openshift-dns-operator/dns-operator-744455d44c-4nw58" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271814 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62300cba-ef24-4937-90da-6e210a59bc66-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271842 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e863c889-47c2-459d-a84f-dc360fe3098f-oauth-serving-cert\") pod \"console-f9d7485db-rvllr\" 
(UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271871 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8dbe9f44-a77f-4664-8076-edfeba6967ec-serving-cert\") pod \"authentication-operator-69f744f599-dflfd\" (UID: \"8dbe9f44-a77f-4664-8076-edfeba6967ec\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271891 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sccf8\" (UniqueName: \"kubernetes.io/projected/e863c889-47c2-459d-a84f-dc360fe3098f-kube-api-access-sccf8\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271914 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a49d6b8-36d5-48de-90fc-1fcd8f172eab-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mfdxf\" (UID: \"6a49d6b8-36d5-48de-90fc-1fcd8f172eab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271937 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-etcd-ca\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271955 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/e863c889-47c2-459d-a84f-dc360fe3098f-console-oauth-config\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.271977 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxvm5\" (UniqueName: \"kubernetes.io/projected/6a3d09cb-0a23-48ff-9e8c-83ca39d2ff63-kube-api-access-gxvm5\") pod \"cluster-samples-operator-665b6dd947-4grdb\" (UID: \"6a3d09cb-0a23-48ff-9e8c-83ca39d2ff63\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4grdb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272005 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62300cba-ef24-4937-90da-6e210a59bc66-audit-dir\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272034 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v75cf\" (UniqueName: \"kubernetes.io/projected/8dbe9f44-a77f-4664-8076-edfeba6967ec-kube-api-access-v75cf\") pod \"authentication-operator-69f744f599-dflfd\" (UID: \"8dbe9f44-a77f-4664-8076-edfeba6967ec\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272052 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a3d09cb-0a23-48ff-9e8c-83ca39d2ff63-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4grdb\" (UID: \"6a3d09cb-0a23-48ff-9e8c-83ca39d2ff63\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4grdb" Mar 07 07:03:44 crc kubenswrapper[4738]: 
I0307 07:03:44.272077 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slfls\" (UniqueName: \"kubernetes.io/projected/3d9a181b-98f6-4420-90fb-6df876c703a0-kube-api-access-slfls\") pod \"downloads-7954f5f757-wwk6z\" (UID: \"3d9a181b-98f6-4420-90fb-6df876c703a0\") " pod="openshift-console/downloads-7954f5f757-wwk6z" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272099 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a89ff68b-6456-4756-b991-5ab40a3f3dfe-node-pullsecrets\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272119 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a89ff68b-6456-4756-b991-5ab40a3f3dfe-config\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272138 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/469adfed-daf6-48e4-9c59-c6823e3a4143-config\") pod \"console-operator-58897d9998-w2cw7\" (UID: \"469adfed-daf6-48e4-9c59-c6823e3a4143\") " pod="openshift-console-operator/console-operator-58897d9998-w2cw7" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272179 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pnhz\" (UniqueName: \"kubernetes.io/projected/de384384-33aa-408f-a998-bf6b3e79e0a8-kube-api-access-7pnhz\") pod \"machine-api-operator-5694c8668f-wjdr5\" (UID: \"de384384-33aa-408f-a998-bf6b3e79e0a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5" Mar 
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272199 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a89ff68b-6456-4756-b991-5ab40a3f3dfe-image-import-ca\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272216 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a89ff68b-6456-4756-b991-5ab40a3f3dfe-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272233 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dbe9f44-a77f-4664-8076-edfeba6967ec-config\") pod \"authentication-operator-69f744f599-dflfd\" (UID: \"8dbe9f44-a77f-4664-8076-edfeba6967ec\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272253 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f752fd6d-5074-4f1f-ae2e-f1c4225536f6-service-ca-bundle\") pod \"router-default-5444994796-k9g6h\" (UID: \"f752fd6d-5074-4f1f-ae2e-f1c4225536f6\") " pod="openshift-ingress/router-default-5444994796-k9g6h"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272275 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2326371b-8e91-4a56-b615-5b98c7f93b79-config\") pod \"kube-apiserver-operator-766d6c64bb-qtvn2\" (UID: \"2326371b-8e91-4a56-b615-5b98c7f93b79\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272296 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f752fd6d-5074-4f1f-ae2e-f1c4225536f6-stats-auth\") pod \"router-default-5444994796-k9g6h\" (UID: \"f752fd6d-5074-4f1f-ae2e-f1c4225536f6\") " pod="openshift-ingress/router-default-5444994796-k9g6h"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272324 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr9pn\" (UniqueName: \"kubernetes.io/projected/6a49d6b8-36d5-48de-90fc-1fcd8f172eab-kube-api-access-pr9pn\") pod \"openshift-apiserver-operator-796bbdcf4f-mfdxf\" (UID: \"6a49d6b8-36d5-48de-90fc-1fcd8f172eab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272343 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e863c889-47c2-459d-a84f-dc360fe3098f-console-serving-cert\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272365 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a89ff68b-6456-4756-b991-5ab40a3f3dfe-serving-cert\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272383 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a89ff68b-6456-4756-b991-5ab40a3f3dfe-audit-dir\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272401 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92b2l\" (UniqueName: \"kubernetes.io/projected/7ac92454-dfca-4c2f-aa74-a1226bfd9f66-kube-api-access-92b2l\") pod \"dns-operator-744455d44c-4nw58\" (UID: \"7ac92454-dfca-4c2f-aa74-a1226bfd9f66\") " pod="openshift-dns-operator/dns-operator-744455d44c-4nw58"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272425 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f752fd6d-5074-4f1f-ae2e-f1c4225536f6-metrics-certs\") pod \"router-default-5444994796-k9g6h\" (UID: \"f752fd6d-5074-4f1f-ae2e-f1c4225536f6\") " pod="openshift-ingress/router-default-5444994796-k9g6h"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272445 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8c5f\" (UniqueName: \"kubernetes.io/projected/6b20a433-e331-4429-9cc9-041080de016b-kube-api-access-z8c5f\") pod \"control-plane-machine-set-operator-78cbb6b69f-f9m4m\" (UID: \"6b20a433-e331-4429-9cc9-041080de016b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f9m4m"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272494 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdntv\" (UniqueName: \"kubernetes.io/projected/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-kube-api-access-rdntv\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272519 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/62300cba-ef24-4937-90da-6e210a59bc66-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272548 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqxf5\" (UniqueName: \"kubernetes.io/projected/62300cba-ef24-4937-90da-6e210a59bc66-kube-api-access-wqxf5\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272568 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a89ff68b-6456-4756-b991-5ab40a3f3dfe-audit\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272591 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e863c889-47c2-459d-a84f-dc360fe3098f-service-ca\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272609 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/469adfed-daf6-48e4-9c59-c6823e3a4143-trusted-ca\") pod \"console-operator-58897d9998-w2cw7\" (UID: \"469adfed-daf6-48e4-9c59-c6823e3a4143\") " pod="openshift-console-operator/console-operator-58897d9998-w2cw7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272630 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f752fd6d-5074-4f1f-ae2e-f1c4225536f6-default-certificate\") pod \"router-default-5444994796-k9g6h\" (UID: \"f752fd6d-5074-4f1f-ae2e-f1c4225536f6\") " pod="openshift-ingress/router-default-5444994796-k9g6h"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272658 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-config\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272686 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9b53e2fb-773e-4c0e-a251-c0fdc80b66f8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k66nh\" (UID: \"9b53e2fb-773e-4c0e-a251-c0fdc80b66f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272753 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2326371b-8e91-4a56-b615-5b98c7f93b79-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qtvn2\" (UID: \"2326371b-8e91-4a56-b615-5b98c7f93b79\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272781 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b20a433-e331-4429-9cc9-041080de016b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-f9m4m\" (UID: \"6b20a433-e331-4429-9cc9-041080de016b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f9m4m"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272845 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a49d6b8-36d5-48de-90fc-1fcd8f172eab-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mfdxf\" (UID: \"6a49d6b8-36d5-48de-90fc-1fcd8f172eab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272865 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62300cba-ef24-4937-90da-6e210a59bc66-audit-policies\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272885 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/62300cba-ef24-4937-90da-6e210a59bc66-encryption-config\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272910 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de384384-33aa-408f-a998-bf6b3e79e0a8-config\") pod \"machine-api-operator-5694c8668f-wjdr5\" (UID: \"de384384-33aa-408f-a998-bf6b3e79e0a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272930 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62300cba-ef24-4937-90da-6e210a59bc66-serving-cert\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272950 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dbe9f44-a77f-4664-8076-edfeba6967ec-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dflfd\" (UID: \"8dbe9f44-a77f-4664-8076-edfeba6967ec\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272969 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e863c889-47c2-459d-a84f-dc360fe3098f-trusted-ca-bundle\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.272988 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/469adfed-daf6-48e4-9c59-c6823e3a4143-serving-cert\") pod \"console-operator-58897d9998-w2cw7\" (UID: \"469adfed-daf6-48e4-9c59-c6823e3a4143\") " pod="openshift-console-operator/console-operator-58897d9998-w2cw7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.273010 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-serving-cert\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.273044 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b53e2fb-773e-4c0e-a251-c0fdc80b66f8-serving-cert\") pod \"openshift-config-operator-7777fb866f-k66nh\" (UID: \"9b53e2fb-773e-4c0e-a251-c0fdc80b66f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.273063 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4t5l\" (UniqueName: \"kubernetes.io/projected/469adfed-daf6-48e4-9c59-c6823e3a4143-kube-api-access-p4t5l\") pod \"console-operator-58897d9998-w2cw7\" (UID: \"469adfed-daf6-48e4-9c59-c6823e3a4143\") " pod="openshift-console-operator/console-operator-58897d9998-w2cw7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.273089 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-etcd-service-ca\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.273107 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a89ff68b-6456-4756-b991-5ab40a3f3dfe-etcd-serving-ca\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.273124 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a89ff68b-6456-4756-b991-5ab40a3f3dfe-encryption-config\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.273143 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dbe9f44-a77f-4664-8076-edfeba6967ec-service-ca-bundle\") pod \"authentication-operator-69f744f599-dflfd\" (UID: \"8dbe9f44-a77f-4664-8076-edfeba6967ec\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.273513 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a89ff68b-6456-4756-b991-5ab40a3f3dfe-audit-dir\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.273932 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dbe9f44-a77f-4664-8076-edfeba6967ec-service-ca-bundle\") pod \"authentication-operator-69f744f599-dflfd\" (UID: \"8dbe9f44-a77f-4664-8076-edfeba6967ec\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.274144 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/62300cba-ef24-4937-90da-6e210a59bc66-etcd-client\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.274341 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e863c889-47c2-459d-a84f-dc360fe3098f-console-config\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.274573 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-etcd-client\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.274824 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/9b53e2fb-773e-4c0e-a251-c0fdc80b66f8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-k66nh\" (UID: \"9b53e2fb-773e-4c0e-a251-c0fdc80b66f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.274855 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2326371b-8e91-4a56-b615-5b98c7f93b79-config\") pod \"kube-apiserver-operator-766d6c64bb-qtvn2\" (UID: \"2326371b-8e91-4a56-b615-5b98c7f93b79\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.274960 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dbe9f44-a77f-4664-8076-edfeba6967ec-config\") pod \"authentication-operator-69f744f599-dflfd\" (UID: \"8dbe9f44-a77f-4664-8076-edfeba6967ec\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.275148 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/de384384-33aa-408f-a998-bf6b3e79e0a8-config\") pod \"machine-api-operator-5694c8668f-wjdr5\" (UID: \"de384384-33aa-408f-a998-bf6b3e79e0a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.275565 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e863c889-47c2-459d-a84f-dc360fe3098f-service-ca\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.275700 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/a89ff68b-6456-4756-b991-5ab40a3f3dfe-audit\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.275948 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/62300cba-ef24-4937-90da-6e210a59bc66-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.276203 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/62300cba-ef24-4937-90da-6e210a59bc66-audit-policies\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.276616 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e863c889-47c2-459d-a84f-dc360fe3098f-oauth-serving-cert\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.276669 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-etcd-ca\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.276877 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7ac92454-dfca-4c2f-aa74-a1226bfd9f66-metrics-tls\") pod \"dns-operator-744455d44c-4nw58\" (UID: \"7ac92454-dfca-4c2f-aa74-a1226bfd9f66\") " pod="openshift-dns-operator/dns-operator-744455d44c-4nw58"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.276970 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/62300cba-ef24-4937-90da-6e210a59bc66-audit-dir\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.277287 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e863c889-47c2-459d-a84f-dc360fe3098f-console-serving-cert\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.277366 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a89ff68b-6456-4756-b991-5ab40a3f3dfe-etcd-client\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.277414 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/de384384-33aa-408f-a998-bf6b3e79e0a8-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wjdr5\" (UID: \"de384384-33aa-408f-a998-bf6b3e79e0a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.277465 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-config\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.277796 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e863c889-47c2-459d-a84f-dc360fe3098f-trusted-ca-bundle\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.277935 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a89ff68b-6456-4756-b991-5ab40a3f3dfe-etcd-serving-ca\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.277951 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/62300cba-ef24-4937-90da-6e210a59bc66-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.278128 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/469adfed-daf6-48e4-9c59-c6823e3a4143-trusted-ca\") pod \"console-operator-58897d9998-w2cw7\" (UID: \"469adfed-daf6-48e4-9c59-c6823e3a4143\") " pod="openshift-console-operator/console-operator-58897d9998-w2cw7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.278384 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8dbe9f44-a77f-4664-8076-edfeba6967ec-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-dflfd\" (UID: \"8dbe9f44-a77f-4664-8076-edfeba6967ec\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.278793 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-etcd-service-ca\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.279407 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a89ff68b-6456-4756-b991-5ab40a3f3dfe-serving-cert\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.279474 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e863c889-47c2-459d-a84f-dc360fe3098f-console-oauth-config\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.280052 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/469adfed-daf6-48e4-9c59-c6823e3a4143-config\") pod \"console-operator-58897d9998-w2cw7\" (UID: \"469adfed-daf6-48e4-9c59-c6823e3a4143\") " pod="openshift-console-operator/console-operator-58897d9998-w2cw7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.280100 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/a89ff68b-6456-4756-b991-5ab40a3f3dfe-node-pullsecrets\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.280507 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a89ff68b-6456-4756-b991-5ab40a3f3dfe-config\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.280797 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-serving-cert\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.281123 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/a89ff68b-6456-4756-b991-5ab40a3f3dfe-image-import-ca\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.281582 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a89ff68b-6456-4756-b991-5ab40a3f3dfe-trusted-ca-bundle\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.281764 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/6a3d09cb-0a23-48ff-9e8c-83ca39d2ff63-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4grdb\" (UID: \"6a3d09cb-0a23-48ff-9e8c-83ca39d2ff63\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4grdb"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.281765 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a49d6b8-36d5-48de-90fc-1fcd8f172eab-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mfdxf\" (UID: \"6a49d6b8-36d5-48de-90fc-1fcd8f172eab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.281848 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a89ff68b-6456-4756-b991-5ab40a3f3dfe-encryption-config\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.282074 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a49d6b8-36d5-48de-90fc-1fcd8f172eab-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mfdxf\" (UID: \"6a49d6b8-36d5-48de-90fc-1fcd8f172eab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.282118 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/469adfed-daf6-48e4-9c59-c6823e3a4143-serving-cert\") pod \"console-operator-58897d9998-w2cw7\" (UID: \"469adfed-daf6-48e4-9c59-c6823e3a4143\") " pod="openshift-console-operator/console-operator-58897d9998-w2cw7"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.282361 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b53e2fb-773e-4c0e-a251-c0fdc80b66f8-serving-cert\") pod \"openshift-config-operator-7777fb866f-k66nh\" (UID: \"9b53e2fb-773e-4c0e-a251-c0fdc80b66f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.283379 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2326371b-8e91-4a56-b615-5b98c7f93b79-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-qtvn2\" (UID: \"2326371b-8e91-4a56-b615-5b98c7f93b79\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.283520 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.283859 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8dbe9f44-a77f-4664-8076-edfeba6967ec-serving-cert\") pod \"authentication-operator-69f744f599-dflfd\" (UID: \"8dbe9f44-a77f-4664-8076-edfeba6967ec\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.284492 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/62300cba-ef24-4937-90da-6e210a59bc66-encryption-config\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.284907 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62300cba-ef24-4937-90da-6e210a59bc66-serving-cert\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.295523 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xd7zm"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.297277 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.298645 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gsfcg"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.300288 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.300453 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wwk6z"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.301003 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.302204 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7vtnd"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.303426 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k66nh"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.305778 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qx5sp"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.307651 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4nw58"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.309318 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.310881 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-rm5z9"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.311774 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rm5z9"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.312550 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lqf85"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.315084 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.315219 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-lqf85"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.315706 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547782-mrbc5"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.317362 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.319013 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.319898 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.320619 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f9m4m"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.322602 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dflfd"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.324007 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wjdr5"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.325608 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kblzd"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.326819 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rwwsd"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.327992 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z"]
Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.330370 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.331689 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.332781 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lqf85"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.333990 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.335568 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.336567 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-lmc29"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.337765 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lmc29"] Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.337776 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lmc29" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.340291 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.361149 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.374387 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tkqzj\" (UniqueName: \"kubernetes.io/projected/f752fd6d-5074-4f1f-ae2e-f1c4225536f6-kube-api-access-tkqzj\") pod \"router-default-5444994796-k9g6h\" (UID: \"f752fd6d-5074-4f1f-ae2e-f1c4225536f6\") " pod="openshift-ingress/router-default-5444994796-k9g6h" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.374515 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slfls\" (UniqueName: \"kubernetes.io/projected/3d9a181b-98f6-4420-90fb-6df876c703a0-kube-api-access-slfls\") pod \"downloads-7954f5f757-wwk6z\" (UID: \"3d9a181b-98f6-4420-90fb-6df876c703a0\") " pod="openshift-console/downloads-7954f5f757-wwk6z" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.374556 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f752fd6d-5074-4f1f-ae2e-f1c4225536f6-service-ca-bundle\") pod \"router-default-5444994796-k9g6h\" (UID: \"f752fd6d-5074-4f1f-ae2e-f1c4225536f6\") " pod="openshift-ingress/router-default-5444994796-k9g6h" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.374580 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f752fd6d-5074-4f1f-ae2e-f1c4225536f6-stats-auth\") pod \"router-default-5444994796-k9g6h\" (UID: 
\"f752fd6d-5074-4f1f-ae2e-f1c4225536f6\") " pod="openshift-ingress/router-default-5444994796-k9g6h" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.374626 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f752fd6d-5074-4f1f-ae2e-f1c4225536f6-metrics-certs\") pod \"router-default-5444994796-k9g6h\" (UID: \"f752fd6d-5074-4f1f-ae2e-f1c4225536f6\") " pod="openshift-ingress/router-default-5444994796-k9g6h" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.374650 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8c5f\" (UniqueName: \"kubernetes.io/projected/6b20a433-e331-4429-9cc9-041080de016b-kube-api-access-z8c5f\") pod \"control-plane-machine-set-operator-78cbb6b69f-f9m4m\" (UID: \"6b20a433-e331-4429-9cc9-041080de016b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f9m4m" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.374698 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f752fd6d-5074-4f1f-ae2e-f1c4225536f6-default-certificate\") pod \"router-default-5444994796-k9g6h\" (UID: \"f752fd6d-5074-4f1f-ae2e-f1c4225536f6\") " pod="openshift-ingress/router-default-5444994796-k9g6h" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.374737 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b20a433-e331-4429-9cc9-041080de016b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-f9m4m\" (UID: \"6b20a433-e331-4429-9cc9-041080de016b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f9m4m" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.380226 4738 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.389516 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6b20a433-e331-4429-9cc9-041080de016b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-f9m4m\" (UID: \"6b20a433-e331-4429-9cc9-041080de016b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f9m4m" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.401032 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.423366 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.429260 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/f752fd6d-5074-4f1f-ae2e-f1c4225536f6-stats-auth\") pod \"router-default-5444994796-k9g6h\" (UID: \"f752fd6d-5074-4f1f-ae2e-f1c4225536f6\") " pod="openshift-ingress/router-default-5444994796-k9g6h" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.441035 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.449766 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/f752fd6d-5074-4f1f-ae2e-f1c4225536f6-default-certificate\") pod \"router-default-5444994796-k9g6h\" (UID: \"f752fd6d-5074-4f1f-ae2e-f1c4225536f6\") " pod="openshift-ingress/router-default-5444994796-k9g6h" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.459964 4738 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.479784 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.486261 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f752fd6d-5074-4f1f-ae2e-f1c4225536f6-service-ca-bundle\") pod \"router-default-5444994796-k9g6h\" (UID: \"f752fd6d-5074-4f1f-ae2e-f1c4225536f6\") " pod="openshift-ingress/router-default-5444994796-k9g6h" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.500560 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.520847 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.530054 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f752fd6d-5074-4f1f-ae2e-f1c4225536f6-metrics-certs\") pod \"router-default-5444994796-k9g6h\" (UID: \"f752fd6d-5074-4f1f-ae2e-f1c4225536f6\") " pod="openshift-ingress/router-default-5444994796-k9g6h" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.539856 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.609680 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.612128 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdqgb\" (UniqueName: 
\"kubernetes.io/projected/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-kube-api-access-jdqgb\") pod \"route-controller-manager-6576b87f9c-5j9bx\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.620218 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.640566 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.661712 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.681324 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.700317 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.722330 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.740968 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.761219 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.781688 4738 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.801106 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.820446 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.841241 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.864676 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.877883 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.881130 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.901293 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.920700 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.941624 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.961427 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 07 07:03:44 crc kubenswrapper[4738]: I0307 07:03:44.982216 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.001106 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.022609 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.043256 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.060656 4738 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.080564 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.100796 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.121384 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.143572 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx"] Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.167044 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.181121 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.198860 4738 request.go:700] Waited for 1.010343767s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.200751 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.221062 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 07 07:03:45 crc kubenswrapper[4738]: 
I0307 07:03:45.241074 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.263374 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.288912 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.299337 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.320904 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.340304 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.361052 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.384705 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.384977 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.384985 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.385583 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.389887 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.401057 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.421931 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.441033 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.461285 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.480865 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.501797 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.520683 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.542447 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.558321 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" event={"ID":"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c","Type":"ContainerStarted","Data":"ab526ceaeb29a98d49bb2d2bb5b2f4b8c67fa3bbe9a0d81aab6ab6aead176162"} Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.558409 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" event={"ID":"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c","Type":"ContainerStarted","Data":"db35ddaedb3788339ed0a6bdfa4a8f715fd56601871c6cb2514ca0094971f58a"} Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.558893 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.560609 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.565004 4738 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-5j9bx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.565095 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" podUID="8b9ad8ed-a9c2-40ce-8664-c59a4911c88c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.580250 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 07 07:03:45 crc 
kubenswrapper[4738]: I0307 07:03:45.600901 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.620730 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.640987 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.661889 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.681274 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.701616 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.722421 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.740578 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.760474 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.780195 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 07 07:03:45 crc kubenswrapper[4738]: 
I0307 07:03:45.800206 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.821249 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.840275 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.860769 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.881896 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.900962 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.920658 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.967717 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxlb6\" (UniqueName: \"kubernetes.io/projected/a89ff68b-6456-4756-b991-5ab40a3f3dfe-kube-api-access-cxlb6\") pod \"apiserver-76f77b778f-rb6b7\" (UID: \"a89ff68b-6456-4756-b991-5ab40a3f3dfe\") " pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:45 crc kubenswrapper[4738]: I0307 07:03:45.984204 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2326371b-8e91-4a56-b615-5b98c7f93b79-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-qtvn2\" (UID: 
\"2326371b-8e91-4a56-b615-5b98c7f93b79\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.011645 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.012297 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mn47\" (UniqueName: \"kubernetes.io/projected/9b53e2fb-773e-4c0e-a251-c0fdc80b66f8-kube-api-access-6mn47\") pod \"openshift-config-operator-7777fb866f-k66nh\" (UID: \"9b53e2fb-773e-4c0e-a251-c0fdc80b66f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.021839 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.041129 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92b2l\" (UniqueName: \"kubernetes.io/projected/7ac92454-dfca-4c2f-aa74-a1226bfd9f66-kube-api-access-92b2l\") pod \"dns-operator-744455d44c-4nw58\" (UID: \"7ac92454-dfca-4c2f-aa74-a1226bfd9f66\") " pod="openshift-dns-operator/dns-operator-744455d44c-4nw58" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.047788 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqxf5\" (UniqueName: \"kubernetes.io/projected/62300cba-ef24-4937-90da-6e210a59bc66-kube-api-access-wqxf5\") pod \"apiserver-7bbb656c7d-ksjbb\" (UID: \"62300cba-ef24-4937-90da-6e210a59bc66\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.056337 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.071749 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr9pn\" (UniqueName: \"kubernetes.io/projected/6a49d6b8-36d5-48de-90fc-1fcd8f172eab-kube-api-access-pr9pn\") pod \"openshift-apiserver-operator-796bbdcf4f-mfdxf\" (UID: \"6a49d6b8-36d5-48de-90fc-1fcd8f172eab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.080313 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxvm5\" (UniqueName: \"kubernetes.io/projected/6a3d09cb-0a23-48ff-9e8c-83ca39d2ff63-kube-api-access-gxvm5\") pod \"cluster-samples-operator-665b6dd947-4grdb\" (UID: \"6a3d09cb-0a23-48ff-9e8c-83ca39d2ff63\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4grdb" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.100656 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdntv\" (UniqueName: \"kubernetes.io/projected/82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef-kube-api-access-rdntv\") pod \"etcd-operator-b45778765-8xdj4\" (UID: \"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef\") " pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.121704 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sccf8\" (UniqueName: \"kubernetes.io/projected/e863c889-47c2-459d-a84f-dc360fe3098f-kube-api-access-sccf8\") pod \"console-f9d7485db-rvllr\" (UID: \"e863c889-47c2-459d-a84f-dc360fe3098f\") " pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.136823 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v75cf\" (UniqueName: 
\"kubernetes.io/projected/8dbe9f44-a77f-4664-8076-edfeba6967ec-kube-api-access-v75cf\") pod \"authentication-operator-69f744f599-dflfd\" (UID: \"8dbe9f44-a77f-4664-8076-edfeba6967ec\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.140042 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.172836 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pnhz\" (UniqueName: \"kubernetes.io/projected/de384384-33aa-408f-a998-bf6b3e79e0a8-kube-api-access-7pnhz\") pod \"machine-api-operator-5694c8668f-wjdr5\" (UID: \"de384384-33aa-408f-a998-bf6b3e79e0a8\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.181619 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.183884 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4t5l\" (UniqueName: \"kubernetes.io/projected/469adfed-daf6-48e4-9c59-c6823e3a4143-kube-api-access-p4t5l\") pod \"console-operator-58897d9998-w2cw7\" (UID: \"469adfed-daf6-48e4-9c59-c6823e3a4143\") " pod="openshift-console-operator/console-operator-58897d9998-w2cw7" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.186029 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.201336 4738 request.go:700] Waited for 1.889176938s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-dockercfg-qx5rd&limit=500&resourceVersion=0 Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.201471 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.203638 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.208367 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.221777 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.239971 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.264985 4738 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.279135 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4grdb" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.282543 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.301441 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.307868 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2"] Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.328182 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.349193 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.349999 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4" Mar 07 07:03:46 crc kubenswrapper[4738]: W0307 07:03:46.350538 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2326371b_8e91_4a56_b615_5b98c7f93b79.slice/crio-321b77a193df286e9ac4c2f52ed13c2ff3f32c790c8eab23e5c94826b886813f WatchSource:0}: Error finding container 321b77a193df286e9ac4c2f52ed13c2ff3f32c790c8eab23e5c94826b886813f: Status 404 returned error can't find the container with id 321b77a193df286e9ac4c2f52ed13c2ff3f32c790c8eab23e5c94826b886813f Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.350626 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-4nw58" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.402457 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkqzj\" (UniqueName: \"kubernetes.io/projected/f752fd6d-5074-4f1f-ae2e-f1c4225536f6-kube-api-access-tkqzj\") pod \"router-default-5444994796-k9g6h\" (UID: \"f752fd6d-5074-4f1f-ae2e-f1c4225536f6\") " pod="openshift-ingress/router-default-5444994796-k9g6h" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.413979 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-w2cw7" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.414460 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.421187 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slfls\" (UniqueName: \"kubernetes.io/projected/3d9a181b-98f6-4420-90fb-6df876c703a0-kube-api-access-slfls\") pod \"downloads-7954f5f757-wwk6z\" (UID: \"3d9a181b-98f6-4420-90fb-6df876c703a0\") " pod="openshift-console/downloads-7954f5f757-wwk6z" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.441357 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-k66nh"] Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.441399 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-rb6b7"] Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.456900 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-k9g6h" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.460698 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.470330 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8c5f\" (UniqueName: \"kubernetes.io/projected/6b20a433-e331-4429-9cc9-041080de016b-kube-api-access-z8c5f\") pod \"control-plane-machine-set-operator-78cbb6b69f-f9m4m\" (UID: \"6b20a433-e331-4429-9cc9-041080de016b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f9m4m" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.480919 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.500353 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.508496 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf"] Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.516844 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/146ee606-df28-4fb2-accf-391a835c0cf2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cc5hn\" (UID: \"146ee606-df28-4fb2-accf-391a835c0cf2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.516878 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z5mz\" (UniqueName: 
\"kubernetes.io/projected/c4420b34-f92a-44fa-bd21-edbbc50d3e09-kube-api-access-4z5mz\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hwl9\" (UID: \"c4420b34-f92a-44fa-bd21-edbbc50d3e09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.516913 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4b70020-a507-4a1c-a80c-7070c9c4b1ab-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b7569\" (UID: \"b4b70020-a507-4a1c-a80c-7070c9c4b1ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.516935 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ef85004e-b70b-4eb0-9551-9436944618dc-auth-proxy-config\") pod \"machine-approver-56656f9798-xj55k\" (UID: \"ef85004e-b70b-4eb0-9551-9436944618dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.516955 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.516979 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mp4f\" (UniqueName: \"kubernetes.io/projected/892836f4-a16e-4521-a239-ee48bfc834d2-kube-api-access-2mp4f\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-gptxv\" (UID: \"892836f4-a16e-4521-a239-ee48bfc834d2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.516998 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e5277c1-ff6d-49c3-9443-19d8f98cae68-installation-pull-secrets\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517021 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lg2v\" (UniqueName: \"kubernetes.io/projected/ad2266fc-521b-461f-ad3e-ef33b89c23d4-kube-api-access-7lg2v\") pod \"controller-manager-879f6c89f-xd7zm\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517040 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517076 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5f6g\" (UniqueName: \"kubernetes.io/projected/00b539ba-07e8-41d9-a8fb-b6b43d051edc-kube-api-access-k5f6g\") pod \"package-server-manager-789f6589d5-kf278\" (UID: \"00b539ba-07e8-41d9-a8fb-b6b43d051edc\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517094 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2266fc-521b-461f-ad3e-ef33b89c23d4-serving-cert\") pod \"controller-manager-879f6c89f-xd7zm\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517111 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/32a48345-2d7c-4d3b-9bc7-d8e20537e3c6-signing-cabundle\") pod \"service-ca-9c57cc56f-vm4bq\" (UID: \"32a48345-2d7c-4d3b-9bc7-d8e20537e3c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm4bq" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517179 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-audit-policies\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517253 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-client-ca\") pod \"controller-manager-879f6c89f-xd7zm\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517281 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/c4420b34-f92a-44fa-bd21-edbbc50d3e09-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hwl9\" (UID: \"c4420b34-f92a-44fa-bd21-edbbc50d3e09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517299 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-config\") pod \"controller-manager-879f6c89f-xd7zm\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517345 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517362 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4420b34-f92a-44fa-bd21-edbbc50d3e09-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hwl9\" (UID: \"c4420b34-f92a-44fa-bd21-edbbc50d3e09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517418 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-login\") pod 
\"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517442 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517503 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517520 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517606 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef85004e-b70b-4eb0-9551-9436944618dc-config\") pod \"machine-approver-56656f9798-xj55k\" (UID: \"ef85004e-b70b-4eb0-9551-9436944618dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" Mar 07 07:03:46 
crc kubenswrapper[4738]: I0307 07:03:46.517624 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517641 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a268242a-d0f5-4e03-9d87-cb5a26701223-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wj9wt\" (UID: \"a268242a-d0f5-4e03-9d87-cb5a26701223\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wj9wt" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517667 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6xpq\" (UniqueName: \"kubernetes.io/projected/146ee606-df28-4fb2-accf-391a835c0cf2-kube-api-access-p6xpq\") pod \"cluster-image-registry-operator-dc59b4c8b-cc5hn\" (UID: \"146ee606-df28-4fb2-accf-391a835c0cf2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517696 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e5277c1-ff6d-49c3-9443-19d8f98cae68-trusted-ca\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517713 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b4b70020-a507-4a1c-a80c-7070c9c4b1ab-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b7569\" (UID: \"b4b70020-a507-4a1c-a80c-7070c9c4b1ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517731 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4b70020-a507-4a1c-a80c-7070c9c4b1ab-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b7569\" (UID: \"b4b70020-a507-4a1c-a80c-7070c9c4b1ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517759 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4j7v\" (UniqueName: \"kubernetes.io/projected/32a48345-2d7c-4d3b-9bc7-d8e20537e3c6-kube-api-access-b4j7v\") pod \"service-ca-9c57cc56f-vm4bq\" (UID: \"32a48345-2d7c-4d3b-9bc7-d8e20537e3c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm4bq" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517801 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xmxv\" (UniqueName: \"kubernetes.io/projected/a268242a-d0f5-4e03-9d87-cb5a26701223-kube-api-access-9xmxv\") pod \"multus-admission-controller-857f4d67dd-wj9wt\" (UID: \"a268242a-d0f5-4e03-9d87-cb5a26701223\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wj9wt" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517825 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/146ee606-df28-4fb2-accf-391a835c0cf2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cc5hn\" (UID: \"146ee606-df28-4fb2-accf-391a835c0cf2\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517842 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xd7zm\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517860 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd2b6e69-054d-461b-8a1d-ca38261a83d3-audit-dir\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517878 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-registry-tls\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517895 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b28zr\" (UniqueName: \"kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-kube-api-access-b28zr\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517911 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" 
(UniqueName: \"kubernetes.io/secret/32a48345-2d7c-4d3b-9bc7-d8e20537e3c6-signing-key\") pod \"service-ca-9c57cc56f-vm4bq\" (UID: \"32a48345-2d7c-4d3b-9bc7-d8e20537e3c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm4bq" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517926 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517945 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517970 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/00b539ba-07e8-41d9-a8fb-b6b43d051edc-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kf278\" (UID: \"00b539ba-07e8-41d9-a8fb-b6b43d051edc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.517987 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-bound-sa-token\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: 
\"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.518019 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.518048 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb4rh\" (UniqueName: \"kubernetes.io/projected/ef85004e-b70b-4eb0-9551-9436944618dc-kube-api-access-zb4rh\") pod \"machine-approver-56656f9798-xj55k\" (UID: \"ef85004e-b70b-4eb0-9551-9436944618dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.518064 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/892836f4-a16e-4521-a239-ee48bfc834d2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gptxv\" (UID: \"892836f4-a16e-4521-a239-ee48bfc834d2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.518102 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtqlw\" (UniqueName: \"kubernetes.io/projected/cd2b6e69-054d-461b-8a1d-ca38261a83d3-kube-api-access-gtqlw\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 
07:03:46.518151 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e5277c1-ff6d-49c3-9443-19d8f98cae68-registry-certificates\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.518183 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/146ee606-df28-4fb2-accf-391a835c0cf2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cc5hn\" (UID: \"146ee606-df28-4fb2-accf-391a835c0cf2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.518199 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/892836f4-a16e-4521-a239-ee48bfc834d2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gptxv\" (UID: \"892836f4-a16e-4521-a239-ee48bfc834d2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.518226 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e5277c1-ff6d-49c3-9443-19d8f98cae68-ca-trust-extracted\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.518242 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/ef85004e-b70b-4eb0-9551-9436944618dc-machine-approver-tls\") pod \"machine-approver-56656f9798-xj55k\" (UID: \"ef85004e-b70b-4eb0-9551-9436944618dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.518278 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: E0307 07:03:46.520472 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:47.020459723 +0000 UTC m=+245.485447034 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.521450 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.542862 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 07 07:03:46 crc kubenswrapper[4738]: W0307 07:03:46.550645 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a49d6b8_36d5_48de_90fc_1fcd8f172eab.slice/crio-4cd3ca3844a09bba63bc62ae8aa9627ea89e004f245b3ecdb32619893884ad38 WatchSource:0}: Error finding container 4cd3ca3844a09bba63bc62ae8aa9627ea89e004f245b3ecdb32619893884ad38: Status 404 returned error can't find the container with id 4cd3ca3844a09bba63bc62ae8aa9627ea89e004f245b3ecdb32619893884ad38 Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.561216 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.582466 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf" event={"ID":"6a49d6b8-36d5-48de-90fc-1fcd8f172eab","Type":"ContainerStarted","Data":"4cd3ca3844a09bba63bc62ae8aa9627ea89e004f245b3ecdb32619893884ad38"} Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.585245 4738 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh" event={"ID":"9b53e2fb-773e-4c0e-a251-c0fdc80b66f8","Type":"ContainerStarted","Data":"cbe10e1060155536e6818abe331c0fdd30b75577b7f0235b4bb96c32cd2c48c2"} Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.586614 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2" event={"ID":"2326371b-8e91-4a56-b615-5b98c7f93b79","Type":"ContainerStarted","Data":"321b77a193df286e9ac4c2f52ed13c2ff3f32c790c8eab23e5c94826b886813f"} Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.587520 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" event={"ID":"a89ff68b-6456-4756-b991-5ab40a3f3dfe","Type":"ContainerStarted","Data":"e97f011781bbe847f25e6b48b3eac6a08160b191f40af98888659145cde6cd12"} Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.588730 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-k9g6h" event={"ID":"f752fd6d-5074-4f1f-ae2e-f1c4225536f6","Type":"ContainerStarted","Data":"1552dd3a2fb01d39ec8f257f3a2f2682747f8807313ebd09fca63806d00365d9"} Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.600405 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.618786 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.618966 4738 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.618993 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c-srv-cert\") pod \"catalog-operator-68c6474976-pfjd6\" (UID: \"2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619022 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzzzt\" (UniqueName: \"kubernetes.io/projected/2d86bd56-4321-49e4-9a52-8591e1060490-kube-api-access-kzzzt\") pod \"machine-config-controller-84d6567774-52lhg\" (UID: \"2d86bd56-4321-49e4-9a52-8591e1060490\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619053 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619069 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ecc54fff-ab62-4cf3-ae4d-0c25951498c8-tmpfs\") pod 
\"packageserver-d55dfcdfc-p7l6z\" (UID: \"ecc54fff-ab62-4cf3-ae4d-0c25951498c8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619092 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619109 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-67thp\" (UID: \"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-67thp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619124 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1ebf6ab-7086-429d-9b03-fdc278235e3b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5frsf\" (UID: \"c1ebf6ab-7086-429d-9b03-fdc278235e3b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619143 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef85004e-b70b-4eb0-9551-9436944618dc-config\") pod \"machine-approver-56656f9798-xj55k\" (UID: \"ef85004e-b70b-4eb0-9551-9436944618dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619178 4738 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619197 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a268242a-d0f5-4e03-9d87-cb5a26701223-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wj9wt\" (UID: \"a268242a-d0f5-4e03-9d87-cb5a26701223\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wj9wt" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619213 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6xpq\" (UniqueName: \"kubernetes.io/projected/146ee606-df28-4fb2-accf-391a835c0cf2-kube-api-access-p6xpq\") pod \"cluster-image-registry-operator-dc59b4c8b-cc5hn\" (UID: \"146ee606-df28-4fb2-accf-391a835c0cf2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619229 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/28d25115-1b3b-4b18-9fa9-df0b7ba49a22-certs\") pod \"machine-config-server-rm5z9\" (UID: \"28d25115-1b3b-4b18-9fa9-df0b7ba49a22\") " pod="openshift-machine-config-operator/machine-config-server-rm5z9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619246 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ecc54fff-ab62-4cf3-ae4d-0c25951498c8-apiservice-cert\") pod \"packageserver-d55dfcdfc-p7l6z\" (UID: \"ecc54fff-ab62-4cf3-ae4d-0c25951498c8\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619262 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e5277c1-ff6d-49c3-9443-19d8f98cae68-trusted-ca\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619279 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4b70020-a507-4a1c-a80c-7070c9c4b1ab-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b7569\" (UID: \"b4b70020-a507-4a1c-a80c-7070c9c4b1ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619293 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4b70020-a507-4a1c-a80c-7070c9c4b1ab-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b7569\" (UID: \"b4b70020-a507-4a1c-a80c-7070c9c4b1ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619322 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1528346-60ec-4879-81b0-72027f1d1477-secret-volume\") pod \"collect-profiles-29547780-62qtb\" (UID: \"e1528346-60ec-4879-81b0-72027f1d1477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619348 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4j7v\" (UniqueName: 
\"kubernetes.io/projected/32a48345-2d7c-4d3b-9bc7-d8e20537e3c6-kube-api-access-b4j7v\") pod \"service-ca-9c57cc56f-vm4bq\" (UID: \"32a48345-2d7c-4d3b-9bc7-d8e20537e3c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm4bq" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619365 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-255nm\" (UniqueName: \"kubernetes.io/projected/e1528346-60ec-4879-81b0-72027f1d1477-kube-api-access-255nm\") pod \"collect-profiles-29547780-62qtb\" (UID: \"e1528346-60ec-4879-81b0-72027f1d1477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619384 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/599d27ce-56f1-452b-bb67-2ad0178c9d61-socket-dir\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619410 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xmxv\" (UniqueName: \"kubernetes.io/projected/a268242a-d0f5-4e03-9d87-cb5a26701223-kube-api-access-9xmxv\") pod \"multus-admission-controller-857f4d67dd-wj9wt\" (UID: \"a268242a-d0f5-4e03-9d87-cb5a26701223\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wj9wt" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619428 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d86bd56-4321-49e4-9a52-8591e1060490-proxy-tls\") pod \"machine-config-controller-84d6567774-52lhg\" (UID: \"2d86bd56-4321-49e4-9a52-8591e1060490\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg" Mar 07 07:03:46 crc 
kubenswrapper[4738]: I0307 07:03:46.619442 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/599d27ce-56f1-452b-bb67-2ad0178c9d61-registration-dir\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.619460 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prfzs\" (UniqueName: \"kubernetes.io/projected/2e1d9fee-954d-417a-8bba-28710f7a4bfa-kube-api-access-prfzs\") pod \"machine-config-operator-74547568cd-c9btk\" (UID: \"2e1d9fee-954d-417a-8bba-28710f7a4bfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" Mar 07 07:03:46 crc kubenswrapper[4738]: E0307 07:03:46.620293 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:47.120257586 +0000 UTC m=+245.585245067 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.624288 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.624774 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef85004e-b70b-4eb0-9551-9436944618dc-config\") pod \"machine-approver-56656f9798-xj55k\" (UID: \"ef85004e-b70b-4eb0-9551-9436944618dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.624779 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.624856 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e5277c1-ff6d-49c3-9443-19d8f98cae68-trusted-ca\") pod 
\"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.625969 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627265 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/146ee606-df28-4fb2-accf-391a835c0cf2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cc5hn\" (UID: \"146ee606-df28-4fb2-accf-391a835c0cf2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627309 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/28d25115-1b3b-4b18-9fa9-df0b7ba49a22-node-bootstrap-token\") pod \"machine-config-server-rm5z9\" (UID: \"28d25115-1b3b-4b18-9fa9-df0b7ba49a22\") " pod="openshift-machine-config-operator/machine-config-server-rm5z9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627326 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-67thp\" (UID: \"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-67thp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627351 4738 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xd7zm\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627373 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-244sm\" (UniqueName: \"kubernetes.io/projected/429c8ef3-3ebe-458d-a6fd-75f52e58c540-kube-api-access-244sm\") pod \"service-ca-operator-777779d784-kblzd\" (UID: \"429c8ef3-3ebe-458d-a6fd-75f52e58c540\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kblzd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627397 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd2b6e69-054d-461b-8a1d-ca38261a83d3-audit-dir\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627415 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a8c4e0e-803d-4706-a411-669e05829f9c-config-volume\") pod \"dns-default-lmc29\" (UID: \"4a8c4e0e-803d-4706-a411-669e05829f9c\") " pod="openshift-dns/dns-default-lmc29" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627446 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9jvc\" (UniqueName: \"kubernetes.io/projected/28d25115-1b3b-4b18-9fa9-df0b7ba49a22-kube-api-access-n9jvc\") pod \"machine-config-server-rm5z9\" (UID: \"28d25115-1b3b-4b18-9fa9-df0b7ba49a22\") " 
pod="openshift-machine-config-operator/machine-config-server-rm5z9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627462 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/429c8ef3-3ebe-458d-a6fd-75f52e58c540-serving-cert\") pod \"service-ca-operator-777779d784-kblzd\" (UID: \"429c8ef3-3ebe-458d-a6fd-75f52e58c540\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kblzd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627478 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w9gt\" (UniqueName: \"kubernetes.io/projected/a0826ee3-cdd2-4c67-9b92-4a33512e6f13-kube-api-access-6w9gt\") pod \"migrator-59844c95c7-qx5sp\" (UID: \"a0826ee3-cdd2-4c67-9b92-4a33512e6f13\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qx5sp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627522 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2e1d9fee-954d-417a-8bba-28710f7a4bfa-images\") pod \"machine-config-operator-74547568cd-c9btk\" (UID: \"2e1d9fee-954d-417a-8bba-28710f7a4bfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627521 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4b70020-a507-4a1c-a80c-7070c9c4b1ab-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b7569\" (UID: \"b4b70020-a507-4a1c-a80c-7070c9c4b1ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627545 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-registry-tls\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627569 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b28zr\" (UniqueName: \"kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-kube-api-access-b28zr\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627586 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ecc54fff-ab62-4cf3-ae4d-0c25951498c8-webhook-cert\") pod \"packageserver-d55dfcdfc-p7l6z\" (UID: \"ecc54fff-ab62-4cf3-ae4d-0c25951498c8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627622 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/32a48345-2d7c-4d3b-9bc7-d8e20537e3c6-signing-key\") pod \"service-ca-9c57cc56f-vm4bq\" (UID: \"32a48345-2d7c-4d3b-9bc7-d8e20537e3c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm4bq" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627639 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627659 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627677 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/00b539ba-07e8-41d9-a8fb-b6b43d051edc-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kf278\" (UID: \"00b539ba-07e8-41d9-a8fb-b6b43d051edc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627706 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-bound-sa-token\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627723 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e1d9fee-954d-417a-8bba-28710f7a4bfa-proxy-tls\") pod \"machine-config-operator-74547568cd-c9btk\" (UID: \"2e1d9fee-954d-417a-8bba-28710f7a4bfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627741 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmxrs\" (UniqueName: \"kubernetes.io/projected/09f978c5-7fd4-4852-95c4-915304c1bf18-kube-api-access-wmxrs\") pod 
\"auto-csr-approver-29547782-mrbc5\" (UID: \"09f978c5-7fd4-4852-95c4-915304c1bf18\") " pod="openshift-infra/auto-csr-approver-29547782-mrbc5" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627759 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b76v\" (UniqueName: \"kubernetes.io/projected/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-kube-api-access-6b76v\") pod \"marketplace-operator-79b997595-67thp\" (UID: \"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-67thp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627791 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627817 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5559595-a2f4-43ac-903f-9b9e418bf435-config\") pod \"kube-controller-manager-operator-78b949d7b-hbllp\" (UID: \"c5559595-a2f4-43ac-903f-9b9e418bf435\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627837 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkbf4\" (UniqueName: \"kubernetes.io/projected/4a8c4e0e-803d-4706-a411-669e05829f9c-kube-api-access-fkbf4\") pod \"dns-default-lmc29\" (UID: \"4a8c4e0e-803d-4706-a411-669e05829f9c\") " pod="openshift-dns/dns-default-lmc29" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627856 4738 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/599d27ce-56f1-452b-bb67-2ad0178c9d61-plugins-dir\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627877 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb4rh\" (UniqueName: \"kubernetes.io/projected/ef85004e-b70b-4eb0-9551-9436944618dc-kube-api-access-zb4rh\") pod \"machine-approver-56656f9798-xj55k\" (UID: \"ef85004e-b70b-4eb0-9551-9436944618dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627893 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/892836f4-a16e-4521-a239-ee48bfc834d2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gptxv\" (UID: \"892836f4-a16e-4521-a239-ee48bfc834d2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627916 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtqlw\" (UniqueName: \"kubernetes.io/projected/cd2b6e69-054d-461b-8a1d-ca38261a83d3-kube-api-access-gtqlw\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627931 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5559595-a2f4-43ac-903f-9b9e418bf435-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hbllp\" 
(UID: \"c5559595-a2f4-43ac-903f-9b9e418bf435\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627960 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e5277c1-ff6d-49c3-9443-19d8f98cae68-registry-certificates\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627978 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/146ee606-df28-4fb2-accf-391a835c0cf2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cc5hn\" (UID: \"146ee606-df28-4fb2-accf-391a835c0cf2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.627999 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/892836f4-a16e-4521-a239-ee48bfc834d2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gptxv\" (UID: \"892836f4-a16e-4521-a239-ee48bfc834d2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628019 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1ebf6ab-7086-429d-9b03-fdc278235e3b-trusted-ca\") pod \"ingress-operator-5b745b69d9-5frsf\" (UID: \"c1ebf6ab-7086-429d-9b03-fdc278235e3b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628048 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e5277c1-ff6d-49c3-9443-19d8f98cae68-ca-trust-extracted\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628064 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ef85004e-b70b-4eb0-9551-9436944618dc-machine-approver-tls\") pod \"machine-approver-56656f9798-xj55k\" (UID: \"ef85004e-b70b-4eb0-9551-9436944618dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628081 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c1ebf6ab-7086-429d-9b03-fdc278235e3b-metrics-tls\") pod \"ingress-operator-5b745b69d9-5frsf\" (UID: \"c1ebf6ab-7086-429d-9b03-fdc278235e3b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628095 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/599d27ce-56f1-452b-bb67-2ad0178c9d61-mountpoint-dir\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628112 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1528346-60ec-4879-81b0-72027f1d1477-config-volume\") pod \"collect-profiles-29547780-62qtb\" (UID: \"e1528346-60ec-4879-81b0-72027f1d1477\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628130 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5559595-a2f4-43ac-903f-9b9e418bf435-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hbllp\" (UID: \"c5559595-a2f4-43ac-903f-9b9e418bf435\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628155 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628184 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7c5a4f3a-15c2-4513-9f96-b24e96e51cc7-srv-cert\") pod \"olm-operator-6b444d44fb-l7jfp\" (UID: \"7c5a4f3a-15c2-4513-9f96-b24e96e51cc7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628202 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e1d9fee-954d-417a-8bba-28710f7a4bfa-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c9btk\" (UID: \"2e1d9fee-954d-417a-8bba-28710f7a4bfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628199 4738 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd2b6e69-054d-461b-8a1d-ca38261a83d3-audit-dir\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628219 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9nc6\" (UniqueName: \"kubernetes.io/projected/7c5a4f3a-15c2-4513-9f96-b24e96e51cc7-kube-api-access-p9nc6\") pod \"olm-operator-6b444d44fb-l7jfp\" (UID: \"7c5a4f3a-15c2-4513-9f96-b24e96e51cc7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628283 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/146ee606-df28-4fb2-accf-391a835c0cf2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cc5hn\" (UID: \"146ee606-df28-4fb2-accf-391a835c0cf2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628311 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5437e5e3-4ef3-4a8b-bcf4-e6ad524ebc5e-cert\") pod \"ingress-canary-rwwsd\" (UID: \"5437e5e3-4ef3-4a8b-bcf4-e6ad524ebc5e\") " pod="openshift-ingress-canary/ingress-canary-rwwsd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628340 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z5mz\" (UniqueName: \"kubernetes.io/projected/c4420b34-f92a-44fa-bd21-edbbc50d3e09-kube-api-access-4z5mz\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hwl9\" (UID: \"c4420b34-f92a-44fa-bd21-edbbc50d3e09\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628363 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhp2c\" (UniqueName: \"kubernetes.io/projected/c1ebf6ab-7086-429d-9b03-fdc278235e3b-kube-api-access-hhp2c\") pod \"ingress-operator-5b745b69d9-5frsf\" (UID: \"c1ebf6ab-7086-429d-9b03-fdc278235e3b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628390 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7c5a4f3a-15c2-4513-9f96-b24e96e51cc7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-l7jfp\" (UID: \"7c5a4f3a-15c2-4513-9f96-b24e96e51cc7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628411 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4b70020-a507-4a1c-a80c-7070c9c4b1ab-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b7569\" (UID: \"b4b70020-a507-4a1c-a80c-7070c9c4b1ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628458 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ef85004e-b70b-4eb0-9551-9436944618dc-auth-proxy-config\") pod \"machine-approver-56656f9798-xj55k\" (UID: \"ef85004e-b70b-4eb0-9551-9436944618dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628480 4738 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628501 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf4w7\" (UniqueName: \"kubernetes.io/projected/ecc54fff-ab62-4cf3-ae4d-0c25951498c8-kube-api-access-zf4w7\") pod \"packageserver-d55dfcdfc-p7l6z\" (UID: \"ecc54fff-ab62-4cf3-ae4d-0c25951498c8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628545 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mp4f\" (UniqueName: \"kubernetes.io/projected/892836f4-a16e-4521-a239-ee48bfc834d2-kube-api-access-2mp4f\") pod \"openshift-controller-manager-operator-756b6f6bc6-gptxv\" (UID: \"892836f4-a16e-4521-a239-ee48bfc834d2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628583 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e5277c1-ff6d-49c3-9443-19d8f98cae68-installation-pull-secrets\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628606 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lg2v\" (UniqueName: \"kubernetes.io/projected/ad2266fc-521b-461f-ad3e-ef33b89c23d4-kube-api-access-7lg2v\") pod 
\"controller-manager-879f6c89f-xd7zm\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628628 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628654 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5f6g\" (UniqueName: \"kubernetes.io/projected/00b539ba-07e8-41d9-a8fb-b6b43d051edc-kube-api-access-k5f6g\") pod \"package-server-manager-789f6589d5-kf278\" (UID: \"00b539ba-07e8-41d9-a8fb-b6b43d051edc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628673 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2266fc-521b-461f-ad3e-ef33b89c23d4-serving-cert\") pod \"controller-manager-879f6c89f-xd7zm\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628691 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/32a48345-2d7c-4d3b-9bc7-d8e20537e3c6-signing-cabundle\") pod \"service-ca-9c57cc56f-vm4bq\" (UID: \"32a48345-2d7c-4d3b-9bc7-d8e20537e3c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm4bq" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628709 4738 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-audit-policies\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628741 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2d86bd56-4321-49e4-9a52-8591e1060490-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-52lhg\" (UID: \"2d86bd56-4321-49e4-9a52-8591e1060490\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628768 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/429c8ef3-3ebe-458d-a6fd-75f52e58c540-config\") pod \"service-ca-operator-777779d784-kblzd\" (UID: \"429c8ef3-3ebe-458d-a6fd-75f52e58c540\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kblzd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628789 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/599d27ce-56f1-452b-bb67-2ad0178c9d61-csi-data-dir\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628811 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-client-ca\") pod \"controller-manager-879f6c89f-xd7zm\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 
07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628829 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87dg2\" (UniqueName: \"kubernetes.io/projected/599d27ce-56f1-452b-bb67-2ad0178c9d61-kube-api-access-87dg2\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628852 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4420b34-f92a-44fa-bd21-edbbc50d3e09-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hwl9\" (UID: \"c4420b34-f92a-44fa-bd21-edbbc50d3e09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628874 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r8zt\" (UniqueName: \"kubernetes.io/projected/5437e5e3-4ef3-4a8b-bcf4-e6ad524ebc5e-kube-api-access-7r8zt\") pod \"ingress-canary-rwwsd\" (UID: \"5437e5e3-4ef3-4a8b-bcf4-e6ad524ebc5e\") " pod="openshift-ingress-canary/ingress-canary-rwwsd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628894 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c-profile-collector-cert\") pod \"catalog-operator-68c6474976-pfjd6\" (UID: \"2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628914 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-config\") pod \"controller-manager-879f6c89f-xd7zm\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628947 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.628970 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4420b34-f92a-44fa-bd21-edbbc50d3e09-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hwl9\" (UID: \"c4420b34-f92a-44fa-bd21-edbbc50d3e09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.629002 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.629032 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a8c4e0e-803d-4706-a411-669e05829f9c-metrics-tls\") pod \"dns-default-lmc29\" (UID: \"4a8c4e0e-803d-4706-a411-669e05829f9c\") " pod="openshift-dns/dns-default-lmc29" Mar 07 07:03:46 crc 
kubenswrapper[4738]: I0307 07:03:46.629052 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js2jq\" (UniqueName: \"kubernetes.io/projected/2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c-kube-api-access-js2jq\") pod \"catalog-operator-68c6474976-pfjd6\" (UID: \"2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.629758 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/146ee606-df28-4fb2-accf-391a835c0cf2-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-cc5hn\" (UID: \"146ee606-df28-4fb2-accf-391a835c0cf2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.630121 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-xd7zm\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.630305 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ef85004e-b70b-4eb0-9551-9436944618dc-auth-proxy-config\") pod \"machine-approver-56656f9798-xj55k\" (UID: \"ef85004e-b70b-4eb0-9551-9436944618dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" Mar 07 07:03:46 crc kubenswrapper[4738]: E0307 07:03:46.630610 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-07 07:03:47.13059365 +0000 UTC m=+245.595580971 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.633177 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e5277c1-ff6d-49c3-9443-19d8f98cae68-registry-certificates\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.633416 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/32a48345-2d7c-4d3b-9bc7-d8e20537e3c6-signing-cabundle\") pod \"service-ca-9c57cc56f-vm4bq\" (UID: \"32a48345-2d7c-4d3b-9bc7-d8e20537e3c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm4bq" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.633901 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/892836f4-a16e-4521-a239-ee48bfc834d2-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-gptxv\" (UID: \"892836f4-a16e-4521-a239-ee48bfc834d2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.634186 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-audit-policies\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.634685 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4420b34-f92a-44fa-bd21-edbbc50d3e09-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hwl9\" (UID: \"c4420b34-f92a-44fa-bd21-edbbc50d3e09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.634804 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-config\") pod \"controller-manager-879f6c89f-xd7zm\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.635004 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e5277c1-ff6d-49c3-9443-19d8f98cae68-ca-trust-extracted\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.636310 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-client-ca\") pod \"controller-manager-879f6c89f-xd7zm\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.639848 4738 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/892836f4-a16e-4521-a239-ee48bfc834d2-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-gptxv\" (UID: \"892836f4-a16e-4521-a239-ee48bfc834d2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.641944 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4420b34-f92a-44fa-bd21-edbbc50d3e09-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hwl9\" (UID: \"c4420b34-f92a-44fa-bd21-edbbc50d3e09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.642250 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/ef85004e-b70b-4eb0-9551-9436944618dc-machine-approver-tls\") pod \"machine-approver-56656f9798-xj55k\" (UID: \"ef85004e-b70b-4eb0-9551-9436944618dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.643523 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.643821 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: 
\"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.644982 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.646334 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/a268242a-d0f5-4e03-9d87-cb5a26701223-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-wj9wt\" (UID: \"a268242a-d0f5-4e03-9d87-cb5a26701223\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wj9wt" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.654200 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.656473 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/146ee606-df28-4fb2-accf-391a835c0cf2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-cc5hn\" (UID: \"146ee606-df28-4fb2-accf-391a835c0cf2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.656793 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/00b539ba-07e8-41d9-a8fb-b6b43d051edc-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-kf278\" (UID: \"00b539ba-07e8-41d9-a8fb-b6b43d051edc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.657629 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.658088 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4b70020-a507-4a1c-a80c-7070c9c4b1ab-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b7569\" (UID: \"b4b70020-a507-4a1c-a80c-7070c9c4b1ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.661720 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.661808 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e5277c1-ff6d-49c3-9443-19d8f98cae68-installation-pull-secrets\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: 
\"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.662272 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/32a48345-2d7c-4d3b-9bc7-d8e20537e3c6-signing-key\") pod \"service-ca-9c57cc56f-vm4bq\" (UID: \"32a48345-2d7c-4d3b-9bc7-d8e20537e3c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm4bq" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.662533 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-registry-tls\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.662768 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2266fc-521b-461f-ad3e-ef33b89c23d4-serving-cert\") pod \"controller-manager-879f6c89f-xd7zm\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.669099 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.670844 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.673509 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-wwk6z" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.691332 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6xpq\" (UniqueName: \"kubernetes.io/projected/146ee606-df28-4fb2-accf-391a835c0cf2-kube-api-access-p6xpq\") pod \"cluster-image-registry-operator-dc59b4c8b-cc5hn\" (UID: \"146ee606-df28-4fb2-accf-391a835c0cf2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.699631 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xmxv\" (UniqueName: \"kubernetes.io/projected/a268242a-d0f5-4e03-9d87-cb5a26701223-kube-api-access-9xmxv\") pod \"multus-admission-controller-857f4d67dd-wj9wt\" (UID: \"a268242a-d0f5-4e03-9d87-cb5a26701223\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-wj9wt" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.718738 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4j7v\" (UniqueName: \"kubernetes.io/projected/32a48345-2d7c-4d3b-9bc7-d8e20537e3c6-kube-api-access-b4j7v\") pod \"service-ca-9c57cc56f-vm4bq\" (UID: \"32a48345-2d7c-4d3b-9bc7-d8e20537e3c6\") " pod="openshift-service-ca/service-ca-9c57cc56f-vm4bq" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.729561 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:46 crc kubenswrapper[4738]: E0307 07:03:46.729749 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:47.229717346 +0000 UTC m=+245.694704667 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.729814 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a8c4e0e-803d-4706-a411-669e05829f9c-metrics-tls\") pod \"dns-default-lmc29\" (UID: \"4a8c4e0e-803d-4706-a411-669e05829f9c\") " pod="openshift-dns/dns-default-lmc29" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.729868 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js2jq\" (UniqueName: \"kubernetes.io/projected/2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c-kube-api-access-js2jq\") pod \"catalog-operator-68c6474976-pfjd6\" (UID: \"2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.729887 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c-srv-cert\") pod \"catalog-operator-68c6474976-pfjd6\" (UID: \"2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.729908 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kzzzt\" (UniqueName: \"kubernetes.io/projected/2d86bd56-4321-49e4-9a52-8591e1060490-kube-api-access-kzzzt\") pod \"machine-config-controller-84d6567774-52lhg\" (UID: \"2d86bd56-4321-49e4-9a52-8591e1060490\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.729925 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ecc54fff-ab62-4cf3-ae4d-0c25951498c8-tmpfs\") pod \"packageserver-d55dfcdfc-p7l6z\" (UID: \"ecc54fff-ab62-4cf3-ae4d-0c25951498c8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.729968 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-67thp\" (UID: \"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-67thp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.729992 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1ebf6ab-7086-429d-9b03-fdc278235e3b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5frsf\" (UID: \"c1ebf6ab-7086-429d-9b03-fdc278235e3b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" Mar 07 07:03:46 crc 
kubenswrapper[4738]: I0307 07:03:46.730029 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/28d25115-1b3b-4b18-9fa9-df0b7ba49a22-certs\") pod \"machine-config-server-rm5z9\" (UID: \"28d25115-1b3b-4b18-9fa9-df0b7ba49a22\") " pod="openshift-machine-config-operator/machine-config-server-rm5z9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730045 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ecc54fff-ab62-4cf3-ae4d-0c25951498c8-apiservice-cert\") pod \"packageserver-d55dfcdfc-p7l6z\" (UID: \"ecc54fff-ab62-4cf3-ae4d-0c25951498c8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730069 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1528346-60ec-4879-81b0-72027f1d1477-secret-volume\") pod \"collect-profiles-29547780-62qtb\" (UID: \"e1528346-60ec-4879-81b0-72027f1d1477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730094 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-255nm\" (UniqueName: \"kubernetes.io/projected/e1528346-60ec-4879-81b0-72027f1d1477-kube-api-access-255nm\") pod \"collect-profiles-29547780-62qtb\" (UID: \"e1528346-60ec-4879-81b0-72027f1d1477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730114 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/599d27ce-56f1-452b-bb67-2ad0178c9d61-socket-dir\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " 
pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730136 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prfzs\" (UniqueName: \"kubernetes.io/projected/2e1d9fee-954d-417a-8bba-28710f7a4bfa-kube-api-access-prfzs\") pod \"machine-config-operator-74547568cd-c9btk\" (UID: \"2e1d9fee-954d-417a-8bba-28710f7a4bfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730359 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d86bd56-4321-49e4-9a52-8591e1060490-proxy-tls\") pod \"machine-config-controller-84d6567774-52lhg\" (UID: \"2d86bd56-4321-49e4-9a52-8591e1060490\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730383 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/599d27ce-56f1-452b-bb67-2ad0178c9d61-registration-dir\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730413 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/28d25115-1b3b-4b18-9fa9-df0b7ba49a22-node-bootstrap-token\") pod \"machine-config-server-rm5z9\" (UID: \"28d25115-1b3b-4b18-9fa9-df0b7ba49a22\") " pod="openshift-machine-config-operator/machine-config-server-rm5z9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730453 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-67thp\" (UID: \"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-67thp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730487 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-244sm\" (UniqueName: \"kubernetes.io/projected/429c8ef3-3ebe-458d-a6fd-75f52e58c540-kube-api-access-244sm\") pod \"service-ca-operator-777779d784-kblzd\" (UID: \"429c8ef3-3ebe-458d-a6fd-75f52e58c540\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kblzd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730509 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a8c4e0e-803d-4706-a411-669e05829f9c-config-volume\") pod \"dns-default-lmc29\" (UID: \"4a8c4e0e-803d-4706-a411-669e05829f9c\") " pod="openshift-dns/dns-default-lmc29" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730533 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9jvc\" (UniqueName: \"kubernetes.io/projected/28d25115-1b3b-4b18-9fa9-df0b7ba49a22-kube-api-access-n9jvc\") pod \"machine-config-server-rm5z9\" (UID: \"28d25115-1b3b-4b18-9fa9-df0b7ba49a22\") " pod="openshift-machine-config-operator/machine-config-server-rm5z9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730551 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/429c8ef3-3ebe-458d-a6fd-75f52e58c540-serving-cert\") pod \"service-ca-operator-777779d784-kblzd\" (UID: \"429c8ef3-3ebe-458d-a6fd-75f52e58c540\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kblzd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730569 4738 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6w9gt\" (UniqueName: \"kubernetes.io/projected/a0826ee3-cdd2-4c67-9b92-4a33512e6f13-kube-api-access-6w9gt\") pod \"migrator-59844c95c7-qx5sp\" (UID: \"a0826ee3-cdd2-4c67-9b92-4a33512e6f13\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qx5sp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730593 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2e1d9fee-954d-417a-8bba-28710f7a4bfa-images\") pod \"machine-config-operator-74547568cd-c9btk\" (UID: \"2e1d9fee-954d-417a-8bba-28710f7a4bfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730628 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ecc54fff-ab62-4cf3-ae4d-0c25951498c8-webhook-cert\") pod \"packageserver-d55dfcdfc-p7l6z\" (UID: \"ecc54fff-ab62-4cf3-ae4d-0c25951498c8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730737 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e1d9fee-954d-417a-8bba-28710f7a4bfa-proxy-tls\") pod \"machine-config-operator-74547568cd-c9btk\" (UID: \"2e1d9fee-954d-417a-8bba-28710f7a4bfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730763 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wmxrs\" (UniqueName: \"kubernetes.io/projected/09f978c5-7fd4-4852-95c4-915304c1bf18-kube-api-access-wmxrs\") pod \"auto-csr-approver-29547782-mrbc5\" (UID: \"09f978c5-7fd4-4852-95c4-915304c1bf18\") " 
pod="openshift-infra/auto-csr-approver-29547782-mrbc5" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730785 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b76v\" (UniqueName: \"kubernetes.io/projected/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-kube-api-access-6b76v\") pod \"marketplace-operator-79b997595-67thp\" (UID: \"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-67thp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730811 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730830 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5559595-a2f4-43ac-903f-9b9e418bf435-config\") pod \"kube-controller-manager-operator-78b949d7b-hbllp\" (UID: \"c5559595-a2f4-43ac-903f-9b9e418bf435\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730845 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkbf4\" (UniqueName: \"kubernetes.io/projected/4a8c4e0e-803d-4706-a411-669e05829f9c-kube-api-access-fkbf4\") pod \"dns-default-lmc29\" (UID: \"4a8c4e0e-803d-4706-a411-669e05829f9c\") " pod="openshift-dns/dns-default-lmc29" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730860 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/599d27ce-56f1-452b-bb67-2ad0178c9d61-plugins-dir\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730884 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5559595-a2f4-43ac-903f-9b9e418bf435-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hbllp\" (UID: \"c5559595-a2f4-43ac-903f-9b9e418bf435\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730913 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1ebf6ab-7086-429d-9b03-fdc278235e3b-trusted-ca\") pod \"ingress-operator-5b745b69d9-5frsf\" (UID: \"c1ebf6ab-7086-429d-9b03-fdc278235e3b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730949 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1528346-60ec-4879-81b0-72027f1d1477-config-volume\") pod \"collect-profiles-29547780-62qtb\" (UID: \"e1528346-60ec-4879-81b0-72027f1d1477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730964 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c1ebf6ab-7086-429d-9b03-fdc278235e3b-metrics-tls\") pod \"ingress-operator-5b745b69d9-5frsf\" (UID: \"c1ebf6ab-7086-429d-9b03-fdc278235e3b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.730980 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/599d27ce-56f1-452b-bb67-2ad0178c9d61-mountpoint-dir\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.731009 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5559595-a2f4-43ac-903f-9b9e418bf435-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hbllp\" (UID: \"c5559595-a2f4-43ac-903f-9b9e418bf435\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.731033 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7c5a4f3a-15c2-4513-9f96-b24e96e51cc7-srv-cert\") pod \"olm-operator-6b444d44fb-l7jfp\" (UID: \"7c5a4f3a-15c2-4513-9f96-b24e96e51cc7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.731049 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e1d9fee-954d-417a-8bba-28710f7a4bfa-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c9btk\" (UID: \"2e1d9fee-954d-417a-8bba-28710f7a4bfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.731065 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9nc6\" (UniqueName: \"kubernetes.io/projected/7c5a4f3a-15c2-4513-9f96-b24e96e51cc7-kube-api-access-p9nc6\") pod \"olm-operator-6b444d44fb-l7jfp\" (UID: \"7c5a4f3a-15c2-4513-9f96-b24e96e51cc7\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.731119 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5437e5e3-4ef3-4a8b-bcf4-e6ad524ebc5e-cert\") pod \"ingress-canary-rwwsd\" (UID: \"5437e5e3-4ef3-4a8b-bcf4-e6ad524ebc5e\") " pod="openshift-ingress-canary/ingress-canary-rwwsd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.731144 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhp2c\" (UniqueName: \"kubernetes.io/projected/c1ebf6ab-7086-429d-9b03-fdc278235e3b-kube-api-access-hhp2c\") pod \"ingress-operator-5b745b69d9-5frsf\" (UID: \"c1ebf6ab-7086-429d-9b03-fdc278235e3b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.731175 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7c5a4f3a-15c2-4513-9f96-b24e96e51cc7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-l7jfp\" (UID: \"7c5a4f3a-15c2-4513-9f96-b24e96e51cc7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.731206 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf4w7\" (UniqueName: \"kubernetes.io/projected/ecc54fff-ab62-4cf3-ae4d-0c25951498c8-kube-api-access-zf4w7\") pod \"packageserver-d55dfcdfc-p7l6z\" (UID: \"ecc54fff-ab62-4cf3-ae4d-0c25951498c8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.731240 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/2d86bd56-4321-49e4-9a52-8591e1060490-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-52lhg\" (UID: \"2d86bd56-4321-49e4-9a52-8591e1060490\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.731260 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/429c8ef3-3ebe-458d-a6fd-75f52e58c540-config\") pod \"service-ca-operator-777779d784-kblzd\" (UID: \"429c8ef3-3ebe-458d-a6fd-75f52e58c540\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kblzd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.731275 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/599d27ce-56f1-452b-bb67-2ad0178c9d61-csi-data-dir\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.731295 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87dg2\" (UniqueName: \"kubernetes.io/projected/599d27ce-56f1-452b-bb67-2ad0178c9d61-kube-api-access-87dg2\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.731312 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r8zt\" (UniqueName: \"kubernetes.io/projected/5437e5e3-4ef3-4a8b-bcf4-e6ad524ebc5e-kube-api-access-7r8zt\") pod \"ingress-canary-rwwsd\" (UID: \"5437e5e3-4ef3-4a8b-bcf4-e6ad524ebc5e\") " pod="openshift-ingress-canary/ingress-canary-rwwsd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.731328 4738 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c-profile-collector-cert\") pod \"catalog-operator-68c6474976-pfjd6\" (UID: \"2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.732699 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2e1d9fee-954d-417a-8bba-28710f7a4bfa-images\") pod \"machine-config-operator-74547568cd-c9btk\" (UID: \"2e1d9fee-954d-417a-8bba-28710f7a4bfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.732794 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/599d27ce-56f1-452b-bb67-2ad0178c9d61-socket-dir\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.735026 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/599d27ce-56f1-452b-bb67-2ad0178c9d61-registration-dir\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.735884 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ecc54fff-ab62-4cf3-ae4d-0c25951498c8-tmpfs\") pod \"packageserver-d55dfcdfc-p7l6z\" (UID: \"ecc54fff-ab62-4cf3-ae4d-0c25951498c8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.737039 4738 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/599d27ce-56f1-452b-bb67-2ad0178c9d61-plugins-dir\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.737734 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2d86bd56-4321-49e4-9a52-8591e1060490-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-52lhg\" (UID: \"2d86bd56-4321-49e4-9a52-8591e1060490\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.738045 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/599d27ce-56f1-452b-bb67-2ad0178c9d61-csi-data-dir\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: E0307 07:03:46.738097 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:47.238080337 +0000 UTC m=+245.703067658 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.739816 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/429c8ef3-3ebe-458d-a6fd-75f52e58c540-config\") pod \"service-ca-operator-777779d784-kblzd\" (UID: \"429c8ef3-3ebe-458d-a6fd-75f52e58c540\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kblzd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.739915 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4a8c4e0e-803d-4706-a411-669e05829f9c-config-volume\") pod \"dns-default-lmc29\" (UID: \"4a8c4e0e-803d-4706-a411-669e05829f9c\") " pod="openshift-dns/dns-default-lmc29" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.739986 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/28d25115-1b3b-4b18-9fa9-df0b7ba49a22-node-bootstrap-token\") pod \"machine-config-server-rm5z9\" (UID: \"28d25115-1b3b-4b18-9fa9-df0b7ba49a22\") " pod="openshift-machine-config-operator/machine-config-server-rm5z9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.740087 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-67thp\" (UID: \"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-67thp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.740735 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2e1d9fee-954d-417a-8bba-28710f7a4bfa-auth-proxy-config\") pod \"machine-config-operator-74547568cd-c9btk\" (UID: \"2e1d9fee-954d-417a-8bba-28710f7a4bfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.740821 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/599d27ce-56f1-452b-bb67-2ad0178c9d61-mountpoint-dir\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.741436 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1528346-60ec-4879-81b0-72027f1d1477-config-volume\") pod \"collect-profiles-29547780-62qtb\" (UID: \"e1528346-60ec-4879-81b0-72027f1d1477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.742020 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5559595-a2f4-43ac-903f-9b9e418bf435-config\") pod \"kube-controller-manager-operator-78b949d7b-hbllp\" (UID: \"c5559595-a2f4-43ac-903f-9b9e418bf435\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.742365 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f9m4m" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.743105 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1ebf6ab-7086-429d-9b03-fdc278235e3b-trusted-ca\") pod \"ingress-operator-5b745b69d9-5frsf\" (UID: \"c1ebf6ab-7086-429d-9b03-fdc278235e3b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.747064 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c-profile-collector-cert\") pod \"catalog-operator-68c6474976-pfjd6\" (UID: \"2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.747515 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c1ebf6ab-7086-429d-9b03-fdc278235e3b-metrics-tls\") pod \"ingress-operator-5b745b69d9-5frsf\" (UID: \"c1ebf6ab-7086-429d-9b03-fdc278235e3b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.748435 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7c5a4f3a-15c2-4513-9f96-b24e96e51cc7-profile-collector-cert\") pod \"olm-operator-6b444d44fb-l7jfp\" (UID: \"7c5a4f3a-15c2-4513-9f96-b24e96e51cc7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.749524 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5437e5e3-4ef3-4a8b-bcf4-e6ad524ebc5e-cert\") pod 
\"ingress-canary-rwwsd\" (UID: \"5437e5e3-4ef3-4a8b-bcf4-e6ad524ebc5e\") " pod="openshift-ingress-canary/ingress-canary-rwwsd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.754494 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5559595-a2f4-43ac-903f-9b9e418bf435-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-hbllp\" (UID: \"c5559595-a2f4-43ac-903f-9b9e418bf435\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.754753 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2d86bd56-4321-49e4-9a52-8591e1060490-proxy-tls\") pod \"machine-config-controller-84d6567774-52lhg\" (UID: \"2d86bd56-4321-49e4-9a52-8591e1060490\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.755365 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/28d25115-1b3b-4b18-9fa9-df0b7ba49a22-certs\") pod \"machine-config-server-rm5z9\" (UID: \"28d25115-1b3b-4b18-9fa9-df0b7ba49a22\") " pod="openshift-machine-config-operator/machine-config-server-rm5z9" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.757912 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1528346-60ec-4879-81b0-72027f1d1477-secret-volume\") pod \"collect-profiles-29547780-62qtb\" (UID: \"e1528346-60ec-4879-81b0-72027f1d1477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.758347 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/429c8ef3-3ebe-458d-a6fd-75f52e58c540-serving-cert\") pod \"service-ca-operator-777779d784-kblzd\" (UID: \"429c8ef3-3ebe-458d-a6fd-75f52e58c540\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kblzd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.760265 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-67thp\" (UID: \"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-67thp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.760656 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-bound-sa-token\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.760808 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c-srv-cert\") pod \"catalog-operator-68c6474976-pfjd6\" (UID: \"2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.761710 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ecc54fff-ab62-4cf3-ae4d-0c25951498c8-apiservice-cert\") pod \"packageserver-d55dfcdfc-p7l6z\" (UID: \"ecc54fff-ab62-4cf3-ae4d-0c25951498c8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.762240 4738 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2e1d9fee-954d-417a-8bba-28710f7a4bfa-proxy-tls\") pod \"machine-config-operator-74547568cd-c9btk\" (UID: \"2e1d9fee-954d-417a-8bba-28710f7a4bfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.762358 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4a8c4e0e-803d-4706-a411-669e05829f9c-metrics-tls\") pod \"dns-default-lmc29\" (UID: \"4a8c4e0e-803d-4706-a411-669e05829f9c\") " pod="openshift-dns/dns-default-lmc29" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.762729 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ecc54fff-ab62-4cf3-ae4d-0c25951498c8-webhook-cert\") pod \"packageserver-d55dfcdfc-p7l6z\" (UID: \"ecc54fff-ab62-4cf3-ae4d-0c25951498c8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.762843 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7c5a4f3a-15c2-4513-9f96-b24e96e51cc7-srv-cert\") pod \"olm-operator-6b444d44fb-l7jfp\" (UID: \"7c5a4f3a-15c2-4513-9f96-b24e96e51cc7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.768726 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b28zr\" (UniqueName: \"kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-kube-api-access-b28zr\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.776748 4738 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-7lg2v\" (UniqueName: \"kubernetes.io/projected/ad2266fc-521b-461f-ad3e-ef33b89c23d4-kube-api-access-7lg2v\") pod \"controller-manager-879f6c89f-xd7zm\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.793886 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb"] Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.804766 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/146ee606-df28-4fb2-accf-391a835c0cf2-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-cc5hn\" (UID: \"146ee606-df28-4fb2-accf-391a835c0cf2\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.806454 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.826055 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-wj9wt" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.829906 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-vm4bq" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.833623 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:46 crc kubenswrapper[4738]: E0307 07:03:46.834400 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:47.334377349 +0000 UTC m=+245.799364670 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.844260 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wjdr5"] Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.846420 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z5mz\" (UniqueName: \"kubernetes.io/projected/c4420b34-f92a-44fa-bd21-edbbc50d3e09-kube-api-access-4z5mz\") pod \"kube-storage-version-migrator-operator-b67b599dd-7hwl9\" (UID: \"c4420b34-f92a-44fa-bd21-edbbc50d3e09\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9" Mar 07 
07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.846585 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-4nw58"] Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.848518 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-w2cw7"] Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.854911 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b4b70020-a507-4a1c-a80c-7070c9c4b1ab-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-b7569\" (UID: \"b4b70020-a507-4a1c-a80c-7070c9c4b1ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.866001 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mp4f\" (UniqueName: \"kubernetes.io/projected/892836f4-a16e-4521-a239-ee48bfc834d2-kube-api-access-2mp4f\") pod \"openshift-controller-manager-operator-756b6f6bc6-gptxv\" (UID: \"892836f4-a16e-4521-a239-ee48bfc834d2\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.886974 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-8xdj4"] Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.889647 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-rvllr"] Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.904341 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtqlw\" (UniqueName: \"kubernetes.io/projected/cd2b6e69-054d-461b-8a1d-ca38261a83d3-kube-api-access-gtqlw\") pod \"oauth-openshift-558db77b4-7vtnd\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.907400 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4grdb"] Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.907805 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5f6g\" (UniqueName: \"kubernetes.io/projected/00b539ba-07e8-41d9-a8fb-b6b43d051edc-kube-api-access-k5f6g\") pod \"package-server-manager-789f6589d5-kf278\" (UID: \"00b539ba-07e8-41d9-a8fb-b6b43d051edc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.918991 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-dflfd"] Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.922475 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb4rh\" (UniqueName: \"kubernetes.io/projected/ef85004e-b70b-4eb0-9551-9436944618dc-kube-api-access-zb4rh\") pod \"machine-approver-56656f9798-xj55k\" (UID: \"ef85004e-b70b-4eb0-9551-9436944618dc\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.926946 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.936003 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:46 crc kubenswrapper[4738]: E0307 07:03:46.936502 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:47.436481063 +0000 UTC m=+245.901468384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:46 crc kubenswrapper[4738]: W0307 07:03:46.958500 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dbe9f44_a77f_4664_8076_edfeba6967ec.slice/crio-ea2b162aede0d7a6683d9c1e5f28fafd6162c59e94cf87479e60c4a335c0506c WatchSource:0}: Error finding container ea2b162aede0d7a6683d9c1e5f28fafd6162c59e94cf87479e60c4a335c0506c: Status 404 returned error can't find the container with id ea2b162aede0d7a6683d9c1e5f28fafd6162c59e94cf87479e60c4a335c0506c Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.961534 4738 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.962973 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w9gt\" (UniqueName: \"kubernetes.io/projected/a0826ee3-cdd2-4c67-9b92-4a33512e6f13-kube-api-access-6w9gt\") pod \"migrator-59844c95c7-qx5sp\" (UID: \"a0826ee3-cdd2-4c67-9b92-4a33512e6f13\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qx5sp" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.991112 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js2jq\" (UniqueName: \"kubernetes.io/projected/2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c-kube-api-access-js2jq\") pod \"catalog-operator-68c6474976-pfjd6\" (UID: \"2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6" Mar 07 07:03:46 crc kubenswrapper[4738]: I0307 07:03:46.999185 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prfzs\" (UniqueName: \"kubernetes.io/projected/2e1d9fee-954d-417a-8bba-28710f7a4bfa-kube-api-access-prfzs\") pod \"machine-config-operator-74547568cd-c9btk\" (UID: \"2e1d9fee-954d-417a-8bba-28710f7a4bfa\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.006068 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-wwk6z"] Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.018863 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.032677 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-244sm\" (UniqueName: \"kubernetes.io/projected/429c8ef3-3ebe-458d-a6fd-75f52e58c540-kube-api-access-244sm\") pod \"service-ca-operator-777779d784-kblzd\" (UID: \"429c8ef3-3ebe-458d-a6fd-75f52e58c540\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-kblzd" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.033659 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278" Mar 07 07:03:47 crc kubenswrapper[4738]: W0307 07:03:47.036096 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d9a181b_98f6_4420_90fb_6df876c703a0.slice/crio-104ea2409262caa034d94af0cad63cc596cfc361e39857a7d5f06ad889b39b92 WatchSource:0}: Error finding container 104ea2409262caa034d94af0cad63cc596cfc361e39857a7d5f06ad889b39b92: Status 404 returned error can't find the container with id 104ea2409262caa034d94af0cad63cc596cfc361e39857a7d5f06ad889b39b92 Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.036548 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:47 crc kubenswrapper[4738]: E0307 07:03:47.036686 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 07:03:47.536665048 +0000 UTC m=+246.001652379 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.036857 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:47 crc kubenswrapper[4738]: E0307 07:03:47.037268 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:47.537258073 +0000 UTC m=+246.002245394 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.055001 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1ebf6ab-7086-429d-9b03-fdc278235e3b-bound-sa-token\") pod \"ingress-operator-5b745b69d9-5frsf\" (UID: \"c1ebf6ab-7086-429d-9b03-fdc278235e3b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.068447 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f9m4m"] Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.069081 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzzzt\" (UniqueName: \"kubernetes.io/projected/2d86bd56-4321-49e4-9a52-8591e1060490-kube-api-access-kzzzt\") pod \"machine-config-controller-84d6567774-52lhg\" (UID: \"2d86bd56-4321-49e4-9a52-8591e1060490\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.078924 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf4w7\" (UniqueName: \"kubernetes.io/projected/ecc54fff-ab62-4cf3-ae4d-0c25951498c8-kube-api-access-zf4w7\") pod \"packageserver-d55dfcdfc-p7l6z\" (UID: \"ecc54fff-ab62-4cf3-ae4d-0c25951498c8\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.079284 
4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.097889 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.114108 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.132556 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmxrs\" (UniqueName: \"kubernetes.io/projected/09f978c5-7fd4-4852-95c4-915304c1bf18-kube-api-access-wmxrs\") pod \"auto-csr-approver-29547782-mrbc5\" (UID: \"09f978c5-7fd4-4852-95c4-915304c1bf18\") " pod="openshift-infra/auto-csr-approver-29547782-mrbc5" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.140825 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:47 crc kubenswrapper[4738]: E0307 07:03:47.141546 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:47.641519795 +0000 UTC m=+246.106507116 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.142618 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-255nm\" (UniqueName: \"kubernetes.io/projected/e1528346-60ec-4879-81b0-72027f1d1477-kube-api-access-255nm\") pod \"collect-profiles-29547780-62qtb\" (UID: \"e1528346-60ec-4879-81b0-72027f1d1477\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.143809 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b76v\" (UniqueName: \"kubernetes.io/projected/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-kube-api-access-6b76v\") pod \"marketplace-operator-79b997595-67thp\" (UID: \"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-67thp" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.144783 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qx5sp" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.159519 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87dg2\" (UniqueName: \"kubernetes.io/projected/599d27ce-56f1-452b-bb67-2ad0178c9d61-kube-api-access-87dg2\") pod \"csi-hostpathplugin-lqf85\" (UID: \"599d27ce-56f1-452b-bb67-2ad0178c9d61\") " pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.161940 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-67thp" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.169238 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.178566 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn"] Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.182771 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r8zt\" (UniqueName: \"kubernetes.io/projected/5437e5e3-4ef3-4a8b-bcf4-e6ad524ebc5e-kube-api-access-7r8zt\") pod \"ingress-canary-rwwsd\" (UID: \"5437e5e3-4ef3-4a8b-bcf4-e6ad524ebc5e\") " pod="openshift-ingress-canary/ingress-canary-rwwsd" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.186106 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.186780 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.208695 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kblzd" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.213811 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.244987 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs\") pod \"network-metrics-daemon-qpkbn\" (UID: \"ba7ca967-58f7-4944-81d8-7bb8957707ad\") " pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.245034 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.247157 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547782-mrbc5" Mar 07 07:03:47 crc kubenswrapper[4738]: E0307 07:03:47.247620 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:47.747608966 +0000 UTC m=+246.212596287 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.255562 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ba7ca967-58f7-4944-81d8-7bb8957707ad-metrics-certs\") pod \"network-metrics-daemon-qpkbn\" (UID: \"ba7ca967-58f7-4944-81d8-7bb8957707ad\") " pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.261035 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9nc6\" (UniqueName: \"kubernetes.io/projected/7c5a4f3a-15c2-4513-9f96-b24e96e51cc7-kube-api-access-p9nc6\") pod \"olm-operator-6b444d44fb-l7jfp\" (UID: \"7c5a4f3a-15c2-4513-9f96-b24e96e51cc7\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.261946 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhp2c\" (UniqueName: \"kubernetes.io/projected/c1ebf6ab-7086-429d-9b03-fdc278235e3b-kube-api-access-hhp2c\") pod \"ingress-operator-5b745b69d9-5frsf\" (UID: \"c1ebf6ab-7086-429d-9b03-fdc278235e3b\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.262346 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.264354 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9jvc\" (UniqueName: \"kubernetes.io/projected/28d25115-1b3b-4b18-9fa9-df0b7ba49a22-kube-api-access-n9jvc\") pod \"machine-config-server-rm5z9\" (UID: \"28d25115-1b3b-4b18-9fa9-df0b7ba49a22\") " pod="openshift-machine-config-operator/machine-config-server-rm5z9" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.285019 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkbf4\" (UniqueName: \"kubernetes.io/projected/4a8c4e0e-803d-4706-a411-669e05829f9c-kube-api-access-fkbf4\") pod \"dns-default-lmc29\" (UID: \"4a8c4e0e-803d-4706-a411-669e05829f9c\") " pod="openshift-dns/dns-default-lmc29" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.287216 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.289399 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rwwsd" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.298068 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-rm5z9" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.310755 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-lqf85" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.318901 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-vm4bq"] Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.320336 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-wj9wt"] Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.321130 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5559595-a2f4-43ac-903f-9b9e418bf435-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-hbllp\" (UID: \"c5559595-a2f4-43ac-903f-9b9e418bf435\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.333412 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-lmc29" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.338368 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" podStartSLOduration=175.33834179 podStartE2EDuration="2m55.33834179s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:47.332051653 +0000 UTC m=+245.797038964" watchObservedRunningTime="2026-03-07 07:03:47.33834179 +0000 UTC m=+245.803329111" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.346759 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:47 crc kubenswrapper[4738]: E0307 07:03:47.347407 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:47.84739054 +0000 UTC m=+246.312377861 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.418753 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7vtnd"] Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.452202 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.452992 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:47 crc kubenswrapper[4738]: E0307 07:03:47.453673 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:47.953648605 +0000 UTC m=+246.418635926 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.503561 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-qpkbn" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.554510 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.554858 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:47 crc kubenswrapper[4738]: E0307 07:03:47.554977 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:48.054934368 +0000 UTC m=+246.519921689 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.555832 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:47 crc kubenswrapper[4738]: E0307 07:03:47.556289 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:48.056269533 +0000 UTC m=+246.521256854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.614545 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vm4bq" event={"ID":"32a48345-2d7c-4d3b-9bc7-d8e20537e3c6","Type":"ContainerStarted","Data":"0bc679a499997658464649ddcdaa75763ffd3db7296f3993f6b95949301a32d5"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.626564 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f9m4m" event={"ID":"6b20a433-e331-4429-9cc9-041080de016b","Type":"ContainerStarted","Data":"c2a626d91a082483b5bb59a2b80ea7a7c3a55f1b31184e846c559196e2c0901e"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.628461 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" event={"ID":"cd2b6e69-054d-461b-8a1d-ca38261a83d3","Type":"ContainerStarted","Data":"c207b0c05bee3628dda5debb82d9227fe849e435dcdb820d142fdfc3d847f9b6"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.636636 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" event={"ID":"ef85004e-b70b-4eb0-9551-9436944618dc","Type":"ContainerStarted","Data":"ec5e767af5e5633a6efaffed5201fc8805db23d933bb41210776d81bd0399125"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.647527 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-k9g6h" 
event={"ID":"f752fd6d-5074-4f1f-ae2e-f1c4225536f6","Type":"ContainerStarted","Data":"fb7557d40d5e11ee5cc63292f2b782b7f8e7ac8e70e2db686f146089948219dc"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.649371 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4grdb" event={"ID":"6a3d09cb-0a23-48ff-9e8c-83ca39d2ff63","Type":"ContainerStarted","Data":"b02376bcd4e1995082fae12c067be05db7a7508baf6d2d3cadb6101ebe5f5751"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.658357 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:47 crc kubenswrapper[4738]: E0307 07:03:47.659181 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:48.159135508 +0000 UTC m=+246.624122829 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.668647 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" event={"ID":"146ee606-df28-4fb2-accf-391a835c0cf2","Type":"ContainerStarted","Data":"09de265f3134c3ecf7140f286f0ec2747e506b5e829e25d6913473b81a12c13e"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.671954 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4nw58" event={"ID":"7ac92454-dfca-4c2f-aa74-a1226bfd9f66","Type":"ContainerStarted","Data":"17e43fe46c5e450d0b7d80c992a84b9f4115e234636e1a5f88d8123ddf7ca14d"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.680138 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5" event={"ID":"de384384-33aa-408f-a998-bf6b3e79e0a8","Type":"ContainerStarted","Data":"3306a7cdbabdf9b67028886499a0446af01ae2088c24a1b24093ad81e357486b"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.680214 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5" event={"ID":"de384384-33aa-408f-a998-bf6b3e79e0a8","Type":"ContainerStarted","Data":"21f82e32d0907b248f54b1faf13e29348998b32a9ea42548800d90ea369dab18"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.700654 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278"] Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.706026 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xd7zm"] Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.732594 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4" event={"ID":"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef","Type":"ContainerStarted","Data":"4a7d9a92eb47a217d0d4239cc7528323b3db23bc61ace26ca1b014123fce044f"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.772468 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:47 crc kubenswrapper[4738]: E0307 07:03:47.773288 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:48.273243621 +0000 UTC m=+246.738230942 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.792830 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wwk6z" event={"ID":"3d9a181b-98f6-4420-90fb-6df876c703a0","Type":"ContainerStarted","Data":"104ea2409262caa034d94af0cad63cc596cfc361e39857a7d5f06ad889b39b92"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.806811 4738 generic.go:334] "Generic (PLEG): container finished" podID="9b53e2fb-773e-4c0e-a251-c0fdc80b66f8" containerID="3206529e6ab43c561bfd4e4d226edf79591722756cb89780dfffc47752155b65" exitCode=0 Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.808011 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh" event={"ID":"9b53e2fb-773e-4c0e-a251-c0fdc80b66f8","Type":"ContainerDied","Data":"3206529e6ab43c561bfd4e4d226edf79591722756cb89780dfffc47752155b65"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.814292 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rvllr" event={"ID":"e863c889-47c2-459d-a84f-dc360fe3098f","Type":"ContainerStarted","Data":"ed45d960a1ac8b275caff15d4b5167091d619068b788979f6252f2a987ad32ff"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.814329 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-rvllr" 
event={"ID":"e863c889-47c2-459d-a84f-dc360fe3098f","Type":"ContainerStarted","Data":"c3aa04d8b7a04873e7a4e065ec72707242cafef470628ff7d9313bdc5880ae16"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.815463 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z"] Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.835391 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd" event={"ID":"8dbe9f44-a77f-4664-8076-edfeba6967ec","Type":"ContainerStarted","Data":"ea2b162aede0d7a6683d9c1e5f28fafd6162c59e94cf87479e60c4a335c0506c"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.841056 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf" event={"ID":"6a49d6b8-36d5-48de-90fc-1fcd8f172eab","Type":"ContainerStarted","Data":"f783cf5faa2b6533a790be745144d1fd4b42433f45ed06ee4d4f2ca7698c5eec"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.842379 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" event={"ID":"62300cba-ef24-4937-90da-6e210a59bc66","Type":"ContainerStarted","Data":"c6a7ea19cc0108f2b8a74d9017aefaa04adc5d08f9a5b3f74ea97559e3109687"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.843907 4738 generic.go:334] "Generic (PLEG): container finished" podID="a89ff68b-6456-4756-b991-5ab40a3f3dfe" containerID="d42033d0c580f2bd921d5f5517be0bd62b00bc740689ca05d58a678eccca1dbf" exitCode=0 Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.843983 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" event={"ID":"a89ff68b-6456-4756-b991-5ab40a3f3dfe","Type":"ContainerDied","Data":"d42033d0c580f2bd921d5f5517be0bd62b00bc740689ca05d58a678eccca1dbf"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 
07:03:47.845964 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wj9wt" event={"ID":"a268242a-d0f5-4e03-9d87-cb5a26701223","Type":"ContainerStarted","Data":"2ef7689ba3e0d344c14e02f2cbfc24658300ec056e0ca72f7ce76b1705ea1a7a"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.850520 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2" event={"ID":"2326371b-8e91-4a56-b615-5b98c7f93b79","Type":"ContainerStarted","Data":"9beaab33f3698138ddb02afa28398e34c87efed4308d1fae8a67923173d149b1"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.854528 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-w2cw7" event={"ID":"469adfed-daf6-48e4-9c59-c6823e3a4143","Type":"ContainerStarted","Data":"fc20f8f072f2f7963265236688445537df1a78c38312be843006b9302926de6b"} Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.854557 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-w2cw7" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.858173 4738 patch_prober.go:28] interesting pod/console-operator-58897d9998-w2cw7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.858222 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-w2cw7" podUID="469adfed-daf6-48e4-9c59-c6823e3a4143" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.873378 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:47 crc kubenswrapper[4738]: E0307 07:03:47.873748 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:48.373728083 +0000 UTC m=+246.838715404 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:47 crc kubenswrapper[4738]: I0307 07:03:47.979347 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:47 crc kubenswrapper[4738]: E0307 07:03:47.986309 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:48.486283945 +0000 UTC m=+246.951271266 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.081383 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:48 crc kubenswrapper[4738]: E0307 07:03:48.081552 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:48.581519428 +0000 UTC m=+247.046506749 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.082011 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:48 crc kubenswrapper[4738]: E0307 07:03:48.082525 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:48.582517794 +0000 UTC m=+247.047505115 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.082953 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9"] Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.084519 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-67thp"] Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.184975 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:48 crc kubenswrapper[4738]: E0307 07:03:48.185332 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:48.685267746 +0000 UTC m=+247.150255067 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.185646 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:48 crc kubenswrapper[4738]: E0307 07:03:48.185993 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:48.685979225 +0000 UTC m=+247.150966546 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:48 crc kubenswrapper[4738]: W0307 07:03:48.228726 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4420b34_f92a_44fa_bd21_edbbc50d3e09.slice/crio-f25f3abedc667f0e6927d67c085b6607e1416b649455a84ad66079b254e93375 WatchSource:0}: Error finding container f25f3abedc667f0e6927d67c085b6607e1416b649455a84ad66079b254e93375: Status 404 returned error can't find the container with id f25f3abedc667f0e6927d67c085b6607e1416b649455a84ad66079b254e93375 Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.286409 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:48 crc kubenswrapper[4738]: E0307 07:03:48.286820 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:48.786799126 +0000 UTC m=+247.251786447 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.288591 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk"] Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.331617 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569"] Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.345548 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv"] Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.347746 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-kblzd"] Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.388449 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:48 crc kubenswrapper[4738]: E0307 07:03:48.388886 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-07 07:03:48.888869691 +0000 UTC m=+247.353857002 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:48 crc kubenswrapper[4738]: W0307 07:03:48.457048 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e1d9fee_954d_417a_8bba_28710f7a4bfa.slice/crio-1b1e2286b3e973cfada6ddeaace8e764618801ae249706fe874ce1a3420174be WatchSource:0}: Error finding container 1b1e2286b3e973cfada6ddeaace8e764618801ae249706fe874ce1a3420174be: Status 404 returned error can't find the container with id 1b1e2286b3e973cfada6ddeaace8e764618801ae249706fe874ce1a3420174be Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.496932 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:48 crc kubenswrapper[4738]: E0307 07:03:48.497620 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:48.99758626 +0000 UTC m=+247.462573771 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.517455 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 07 07:03:48 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld
Mar 07 07:03:48 crc kubenswrapper[4738]: [+]process-running ok
Mar 07 07:03:48 crc kubenswrapper[4738]: healthz check failed
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.517520 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.538804 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-k9g6h"
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.549256 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qx5sp"]
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.599292 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:03:48 crc kubenswrapper[4738]: E0307 07:03:48.600183 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:49.100150327 +0000 UTC m=+247.565137648 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:48 crc kubenswrapper[4738]: W0307 07:03:48.671453 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4b70020_a507_4a1c_a80c_7070c9c4b1ab.slice/crio-62ea9071e48a7322056f7c7095c3ef7c685c0f85561ed461fc75ebeeb997f3de WatchSource:0}: Error finding container 62ea9071e48a7322056f7c7095c3ef7c685c0f85561ed461fc75ebeeb997f3de: Status 404 returned error can't find the container with id 62ea9071e48a7322056f7c7095c3ef7c685c0f85561ed461fc75ebeeb997f3de
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.701777 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 07 07:03:48 crc kubenswrapper[4738]: E0307 07:03:48.702256 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:49.202214702 +0000 UTC m=+247.667202023 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.803623 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:03:48 crc kubenswrapper[4738]: E0307 07:03:48.804237 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:49.304219184 +0000 UTC m=+247.769206505 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.854772 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-k9g6h" podStartSLOduration=176.854752162 podStartE2EDuration="2m56.854752162s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:48.853670224 +0000 UTC m=+247.318657545" watchObservedRunningTime="2026-03-07 07:03:48.854752162 +0000 UTC m=+247.319739483"
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.882012 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-lmc29"]
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.893631 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg"]
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.900536 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qx5sp" event={"ID":"a0826ee3-cdd2-4c67-9b92-4a33512e6f13","Type":"ContainerStarted","Data":"f5294518f96c501ed15b3e3ffa4a306267d66f4f17ca250c9391b5d0c239df24"}
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.905027 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 07 07:03:48 crc kubenswrapper[4738]: E0307 07:03:48.905535 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:49.405519417 +0000 UTC m=+247.870506738 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.917718 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9" event={"ID":"c4420b34-f92a-44fa-bd21-edbbc50d3e09","Type":"ContainerStarted","Data":"f25f3abedc667f0e6927d67c085b6607e1416b649455a84ad66079b254e93375"}
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.919817 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-wwk6z" event={"ID":"3d9a181b-98f6-4420-90fb-6df876c703a0","Type":"ContainerStarted","Data":"e7a4211eac61413c0824b42ef7b56b48d63efa6eb3ddd350d8a791a41dd5dd91"}
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.920415 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-wwk6z"
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.929336 4738 patch_prober.go:28] interesting pod/downloads-7954f5f757-wwk6z container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body=
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.929412 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wwk6z" podUID="3d9a181b-98f6-4420-90fb-6df876c703a0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused"
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.935012 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569" event={"ID":"b4b70020-a507-4a1c-a80c-7070c9c4b1ab","Type":"ContainerStarted","Data":"62ea9071e48a7322056f7c7095c3ef7c685c0f85561ed461fc75ebeeb997f3de"}
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.945607 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-qtvn2" podStartSLOduration=176.945582438 podStartE2EDuration="2m56.945582438s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:48.936830986 +0000 UTC m=+247.401818307" watchObservedRunningTime="2026-03-07 07:03:48.945582438 +0000 UTC m=+247.410569759"
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.945880 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6"]
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.956927 4738 generic.go:334] "Generic (PLEG): container finished" podID="62300cba-ef24-4937-90da-6e210a59bc66" containerID="c502274b8e4172e635962554f5ec20af10fbbda56ba3d99b50aa9e60b644ff16" exitCode=0
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.957022 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" event={"ID":"62300cba-ef24-4937-90da-6e210a59bc66","Type":"ContainerDied","Data":"c502274b8e4172e635962554f5ec20af10fbbda56ba3d99b50aa9e60b644ff16"}
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.968292 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4" event={"ID":"82776d3a-b1d3-47f5-8c1a-c84a2ef8bfef","Type":"ContainerStarted","Data":"5096f3d44dd06a069fd1ad7e2d266bc88fd2b9313d95117c53a4f28130920f0d"}
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.977642 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd" podStartSLOduration=177.977617447 podStartE2EDuration="2m57.977617447s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:48.973360004 +0000 UTC m=+247.438347325" watchObservedRunningTime="2026-03-07 07:03:48.977617447 +0000 UTC m=+247.442604768"
Mar 07 07:03:48 crc kubenswrapper[4738]: W0307 07:03:48.985990 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d86bd56_4321_49e4_9a52_8591e1060490.slice/crio-160fee14f094e911548c2f621bf8ad4ee4940d301ef3c617a845f069c1247020 WatchSource:0}: Error finding container 160fee14f094e911548c2f621bf8ad4ee4940d301ef3c617a845f069c1247020: Status 404 returned error can't find the container with id 160fee14f094e911548c2f621bf8ad4ee4940d301ef3c617a845f069c1247020
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.997113 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4grdb" event={"ID":"6a3d09cb-0a23-48ff-9e8c-83ca39d2ff63","Type":"ContainerStarted","Data":"6b8f3cd0fd8b2b09c56713b271241d6cadb9b9254ce768ff888e2dc762b84d95"}
Mar 07 07:03:48 crc kubenswrapper[4738]: I0307 07:03:48.998834 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278" event={"ID":"00b539ba-07e8-41d9-a8fb-b6b43d051edc","Type":"ContainerStarted","Data":"b002c839d1f9ef74acc3a87e9d85ffd702e5b22becb0b2c1c863c0e1f29db8b7"}
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.007059 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:03:49 crc kubenswrapper[4738]: E0307 07:03:49.007491 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:49.507474379 +0000 UTC m=+247.972461700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.030320 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4nw58" event={"ID":"7ac92454-dfca-4c2f-aa74-a1226bfd9f66","Type":"ContainerStarted","Data":"f40d006ae826a6499bf0aa053008854e376efc0035ebf6e59957ec3b1f3939f1"}
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.039782 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv" event={"ID":"892836f4-a16e-4521-a239-ee48bfc834d2","Type":"ContainerStarted","Data":"265dd6b9bb601b8889e776992f669d525c7d548e5ee8a16eecfdbcb15ffdaaf9"}
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.079143 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-w2cw7" podStartSLOduration=178.079122006 podStartE2EDuration="2m58.079122006s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:49.075593353 +0000 UTC m=+247.540580694" watchObservedRunningTime="2026-03-07 07:03:49.079122006 +0000 UTC m=+247.544109327"
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.079866 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" event={"ID":"ad2266fc-521b-461f-ad3e-ef33b89c23d4","Type":"ContainerStarted","Data":"b2c3baebcaecd93e265a99404b6f61ba74055991fc8e6d8ad53610ce2cecc5e2"}
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.095252 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-w2cw7" event={"ID":"469adfed-daf6-48e4-9c59-c6823e3a4143","Type":"ContainerStarted","Data":"32bc418128f84f9caa16509180147cc4e3338adbc45cb7642e2554f64f4b388c"}
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.099139 4738 patch_prober.go:28] interesting pod/console-operator-58897d9998-w2cw7 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.099224 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-w2cw7" podUID="469adfed-daf6-48e4-9c59-c6823e3a4143" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.7:8443/readyz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.108954 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-rvllr" podStartSLOduration=178.108924116 podStartE2EDuration="2m58.108924116s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:49.024972892 +0000 UTC m=+247.489960223" watchObservedRunningTime="2026-03-07 07:03:49.108924116 +0000 UTC m=+247.573911437"
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.118690 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 07 07:03:49 crc kubenswrapper[4738]: E0307 07:03:49.118947 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:49.61892357 +0000 UTC m=+248.083910891 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.122692 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" event={"ID":"146ee606-df28-4fb2-accf-391a835c0cf2","Type":"ContainerStarted","Data":"13ca65fc971e1e07bd79cd5cf5fea2c25be276d666a316eadcdc8c2564cbd003"}
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.125279 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rwwsd"]
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.126008 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rm5z9" event={"ID":"28d25115-1b3b-4b18-9fa9-df0b7ba49a22","Type":"ContainerStarted","Data":"a964fe411a9dd8e28e826f3b10a67d949543125d6e1523bd6e8c21a3ec1bead9"}
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.126993 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" event={"ID":"2e1d9fee-954d-417a-8bba-28710f7a4bfa","Type":"ContainerStarted","Data":"1b1e2286b3e973cfada6ddeaace8e764618801ae249706fe874ce1a3420174be"}
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.129265 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" event={"ID":"ecc54fff-ab62-4cf3-ae4d-0c25951498c8","Type":"ContainerStarted","Data":"8a69d4f4362ce4d81cb0116c8a16434905f4650803730c424e2a62949545dcc5"}
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.131070 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z"
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.144698 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:03:49 crc kubenswrapper[4738]: E0307 07:03:49.148817 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:49.648800802 +0000 UTC m=+248.113788123 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.149688 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547782-mrbc5"]
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.149918 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f9m4m" event={"ID":"6b20a433-e331-4429-9cc9-041080de016b","Type":"ContainerStarted","Data":"93c504a1c81f3437810b0d99be8d71c64bd112d36510226f5c52639f5b731a7d"}
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.153626 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mfdxf" podStartSLOduration=178.15360733 podStartE2EDuration="2m58.15360733s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:49.113053105 +0000 UTC m=+247.578040436" watchObservedRunningTime="2026-03-07 07:03:49.15360733 +0000 UTC m=+247.618594661"
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.157908 4738 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-p7l6z container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body=
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.157986 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" podUID="ecc54fff-ab62-4cf3-ae4d-0c25951498c8" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused"
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.185113 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lqf85"]
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.186823 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kblzd" event={"ID":"429c8ef3-3ebe-458d-a6fd-75f52e58c540","Type":"ContainerStarted","Data":"5dd13b10f0b5a53b34f41028c22cc16b0648709af39b045c56a1ea3dced09ddf"}
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.187518 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-qpkbn"]
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.199861 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-67thp" event={"ID":"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc","Type":"ContainerStarted","Data":"f1c44c3a2e101aa6aa6f78141d0e0c78c5231986b63f64bcf4884107a76effcb"}
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.203086 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp"]
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.240945 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5" event={"ID":"de384384-33aa-408f-a998-bf6b3e79e0a8","Type":"ContainerStarted","Data":"ff7d8898cbb8edcee5ea3a59d33df3d493f463ebd79ffe70abad73ab95dc12b9"}
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.258720 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.260715 4738 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 07 07:03:49 crc kubenswrapper[4738]: E0307 07:03:49.261458 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:49.761425556 +0000 UTC m=+248.226412877 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.262687 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-8xdj4" podStartSLOduration=177.262666599 podStartE2EDuration="2m57.262666599s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:49.260512182 +0000 UTC m=+247.725499503" watchObservedRunningTime="2026-03-07 07:03:49.262666599 +0000 UTC m=+247.727653920"
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.270479 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-vm4bq" event={"ID":"32a48345-2d7c-4d3b-9bc7-d8e20537e3c6","Type":"ContainerStarted","Data":"2df94b49fd76a70a7d0bebd4c7931287b19ad5d08e5e28727380df292441da7e"}
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.304451 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-cc5hn" podStartSLOduration=177.304430836 podStartE2EDuration="2m57.304430836s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:49.301788296 +0000 UTC m=+247.766775617" watchObservedRunningTime="2026-03-07 07:03:49.304430836 +0000 UTC m=+247.769418157"
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.307602 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-dflfd" event={"ID":"8dbe9f44-a77f-4664-8076-edfeba6967ec","Type":"ContainerStarted","Data":"34dd2ad8b6b000fbf7953672111e52679e152bd8335b3e638e01ef0b348a1070"}
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.328529 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" podStartSLOduration=177.328505153 podStartE2EDuration="2m57.328505153s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:49.325859433 +0000 UTC m=+247.790846774" watchObservedRunningTime="2026-03-07 07:03:49.328505153 +0000 UTC m=+247.793492474"
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.361277 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:03:49 crc kubenswrapper[4738]: E0307 07:03:49.361609 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:49.86159437 +0000 UTC m=+248.326581701 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.371433 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-wwk6z" podStartSLOduration=178.371409659 podStartE2EDuration="2m58.371409659s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:49.371285116 +0000 UTC m=+247.836272447" watchObservedRunningTime="2026-03-07 07:03:49.371409659 +0000 UTC m=+247.836396980"
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.391187 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp"]
Mar 07 07:03:49 crc kubenswrapper[4738]: W0307 07:03:49.409710 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5559595_a2f4_43ac_903f_9b9e418bf435.slice/crio-903a253f4b3b604acb800dc7302be10c148dc2a7440bd4f2e4ba73902eb687c5 WatchSource:0}: Error finding container 903a253f4b3b604acb800dc7302be10c148dc2a7440bd4f2e4ba73902eb687c5: Status 404 returned error can't find the container with id 903a253f4b3b604acb800dc7302be10c148dc2a7440bd4f2e4ba73902eb687c5
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.412120 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf"]
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.464601 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 07 07:03:49 crc kubenswrapper[4738]: E0307 07:03:49.464989 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:49.964972569 +0000 UTC m=+248.429959890 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.466250 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-f9m4m" podStartSLOduration=177.466226301 podStartE2EDuration="2m57.466226301s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:49.428585335 +0000 UTC m=+247.893572666" watchObservedRunningTime="2026-03-07 07:03:49.466226301 +0000 UTC m=+247.931213612"
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.468033 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb"]
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.468104 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-vm4bq" podStartSLOduration=177.468092781 podStartE2EDuration="2m57.468092781s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:49.464135477 +0000 UTC m=+247.929122798" watchObservedRunningTime="2026-03-07 07:03:49.468092781 +0000 UTC m=+247.933080102"
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.478840 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 07 07:03:49 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld
Mar 07 07:03:49 crc kubenswrapper[4738]: [+]process-running ok
Mar 07 07:03:49 crc kubenswrapper[4738]: healthz check failed
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.478914 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.568071 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:03:49 crc kubenswrapper[4738]: E0307 07:03:49.568634 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:50.068615534 +0000 UTC m=+248.533602855 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.669785 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 07 07:03:49 crc kubenswrapper[4738]: E0307 07:03:49.669950 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:50.169916248 +0000 UTC m=+248.634903579 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.670623 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:03:49 crc kubenswrapper[4738]: E0307 07:03:49.670991 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:50.170981046 +0000 UTC m=+248.635968367 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.771739 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:49 crc kubenswrapper[4738]: E0307 07:03:49.771886 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:50.271867229 +0000 UTC m=+248.736854550 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.771953 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:49 crc kubenswrapper[4738]: E0307 07:03:49.772374 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:50.272364353 +0000 UTC m=+248.737351674 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.872870 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:49 crc kubenswrapper[4738]: E0307 07:03:49.872957 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:50.372939977 +0000 UTC m=+248.837927288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.873104 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:49 crc kubenswrapper[4738]: E0307 07:03:49.873378 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:50.373370469 +0000 UTC m=+248.838357780 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:49 crc kubenswrapper[4738]: I0307 07:03:49.974121 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:49 crc kubenswrapper[4738]: E0307 07:03:49.974593 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:50.47457793 +0000 UTC m=+248.939565251 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.076858 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:50 crc kubenswrapper[4738]: E0307 07:03:50.077351 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:50.577331372 +0000 UTC m=+249.042318693 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.179617 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:50 crc kubenswrapper[4738]: E0307 07:03:50.181147 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:50.68107858 +0000 UTC m=+249.146065901 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.282272 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:50 crc kubenswrapper[4738]: E0307 07:03:50.282709 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:50.782696533 +0000 UTC m=+249.247683854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.320888 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-rm5z9" event={"ID":"28d25115-1b3b-4b18-9fa9-df0b7ba49a22","Type":"ContainerStarted","Data":"7fdbb6c8aae2cb630f055b0648d13ee39949b42fc4239f514f8b5b0e934d1d43"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.333858 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-4nw58" event={"ID":"7ac92454-dfca-4c2f-aa74-a1226bfd9f66","Type":"ContainerStarted","Data":"57e8600e9dea2b50b70f609bea9f2ca62a802672928c6605125bfc3dc235148e"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.337454 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-rm5z9" podStartSLOduration=6.337434363 podStartE2EDuration="6.337434363s" podCreationTimestamp="2026-03-07 07:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:50.337081604 +0000 UTC m=+248.802068925" watchObservedRunningTime="2026-03-07 07:03:50.337434363 +0000 UTC m=+248.802421694" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.338502 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-wjdr5" podStartSLOduration=178.338493612 podStartE2EDuration="2m58.338493612s" 
podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:49.501228989 +0000 UTC m=+247.966216320" watchObservedRunningTime="2026-03-07 07:03:50.338493612 +0000 UTC m=+248.803480923" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.341585 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kblzd" event={"ID":"429c8ef3-3ebe-458d-a6fd-75f52e58c540","Type":"ContainerStarted","Data":"71431b39e49a30aa58cde99c1a3ed7be112f9b4661f6e01e92822e86b6c4d920"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.365549 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-4nw58" podStartSLOduration=178.365530987 podStartE2EDuration="2m58.365530987s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:50.365489366 +0000 UTC m=+248.830476687" watchObservedRunningTime="2026-03-07 07:03:50.365530987 +0000 UTC m=+248.830518318" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.366804 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" event={"ID":"62300cba-ef24-4937-90da-6e210a59bc66","Type":"ContainerStarted","Data":"730972eda1af98ee6e426dc8ce294f9962cb74e4abccf235db8c74b03959add3"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.372402 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" event={"ID":"e1528346-60ec-4879-81b0-72027f1d1477","Type":"ContainerStarted","Data":"6665d6e29556438d724045bb4a48c66b142d07234722dc3ccf1fd38a073fa84b"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.384792 4738 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-67thp" event={"ID":"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc","Type":"ContainerStarted","Data":"0630bea3371a28de009e9ab4e1f64a0ceb86668c7a022aa74187c40010202ef7"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.384852 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-67thp" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.386833 4738 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-67thp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.386897 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-67thp" podUID="6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.388247 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:50 crc kubenswrapper[4738]: E0307 07:03:50.392059 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 07:03:50.89202891 +0000 UTC m=+249.357016231 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.392907 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-kblzd" podStartSLOduration=178.392893422 podStartE2EDuration="2m58.392893422s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:50.391603219 +0000 UTC m=+248.856590540" watchObservedRunningTime="2026-03-07 07:03:50.392893422 +0000 UTC m=+248.857880743" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.399896 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp" event={"ID":"c5559595-a2f4-43ac-903f-9b9e418bf435","Type":"ContainerStarted","Data":"903a253f4b3b604acb800dc7302be10c148dc2a7440bd4f2e4ba73902eb687c5"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.406489 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569" event={"ID":"b4b70020-a507-4a1c-a80c-7070c9c4b1ab","Type":"ContainerStarted","Data":"48714dae1f4b3297b58b1c924badc9fe41c2cb8664abc9f17922b6851a1673f4"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.456019 4738 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-infra/auto-csr-approver-29547782-mrbc5" event={"ID":"09f978c5-7fd4-4852-95c4-915304c1bf18","Type":"ContainerStarted","Data":"60a375b1357d468345d00821cfa7fac44badfc3c83b95c00f04fc35f5d8ad0d9"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.463096 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9" event={"ID":"c4420b34-f92a-44fa-bd21-edbbc50d3e09","Type":"ContainerStarted","Data":"d44d4117a324d0428c08f226f39790a3e2cdb48cb197496c62fa4cd105fd673e"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.466450 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" event={"ID":"ef85004e-b70b-4eb0-9551-9436944618dc","Type":"ContainerStarted","Data":"55a79607aecf200586d9699126612fd39a77321ab548e186ef2d17928b132b77"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.466550 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 07:03:50 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld Mar 07 07:03:50 crc kubenswrapper[4738]: [+]process-running ok Mar 07 07:03:50 crc kubenswrapper[4738]: healthz check failed Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.466591 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.469885 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lmc29" 
event={"ID":"4a8c4e0e-803d-4706-a411-669e05829f9c","Type":"ContainerStarted","Data":"12f90082bf4b64ce3139a5b427f992ff8ea2c11825f23a1d14ba3de1558e01a9"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.472864 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" event={"ID":"cd2b6e69-054d-461b-8a1d-ca38261a83d3","Type":"ContainerStarted","Data":"3561ba94a3ae92c42636eef0b14f64b33a1ee4c46d2535ab76cd541e58c18524"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.473463 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.474962 4738 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-7vtnd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" start-of-body= Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.475009 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.475841 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rwwsd" event={"ID":"5437e5e3-4ef3-4a8b-bcf4-e6ad524ebc5e","Type":"ContainerStarted","Data":"8f096a2aa40afabefdd274241b6d06e4d33bfbec1313e95fae08b3672203a0e2"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.479914 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4grdb" 
event={"ID":"6a3d09cb-0a23-48ff-9e8c-83ca39d2ff63","Type":"ContainerStarted","Data":"dd91e397fafcb6ed3d004d622df6e1b557d378c204eb2b27c304bb959fcc5298"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.493810 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.494712 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" event={"ID":"2e1d9fee-954d-417a-8bba-28710f7a4bfa","Type":"ContainerStarted","Data":"87b37afbd12810e6802dc5fb03d6a8f70c528ec72882da310cb62a94294a3d43"} Mar 07 07:03:50 crc kubenswrapper[4738]: E0307 07:03:50.497996 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:50.997978037 +0000 UTC m=+249.462965358 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.499915 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" event={"ID":"ba7ca967-58f7-4944-81d8-7bb8957707ad","Type":"ContainerStarted","Data":"f7cf4e6aa4b1df3a56bb7071d10f137c370a73d6b3fef7f9506b4f92c3e7810d"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.502443 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp" event={"ID":"7c5a4f3a-15c2-4513-9f96-b24e96e51cc7","Type":"ContainerStarted","Data":"72ce9f64084b925f0d0f9c3752dc4abf56a08d5bbfff148f239f3ce09693e03f"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.503309 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-7hwl9" podStartSLOduration=178.503280077 podStartE2EDuration="2m58.503280077s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:50.49321278 +0000 UTC m=+248.958200111" watchObservedRunningTime="2026-03-07 07:03:50.503280077 +0000 UTC m=+248.968267398" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.503431 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-67thp" podStartSLOduration=178.503425681 
podStartE2EDuration="2m58.503425681s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:50.415806619 +0000 UTC m=+248.880793940" watchObservedRunningTime="2026-03-07 07:03:50.503425681 +0000 UTC m=+248.968413002" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.507385 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" event={"ID":"c1ebf6ab-7086-429d-9b03-fdc278235e3b","Type":"ContainerStarted","Data":"fc853dab8b08baaa3813fd28ceee73d977f19ab15435937a6636ebcbdba85ffd"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.523441 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278" event={"ID":"00b539ba-07e8-41d9-a8fb-b6b43d051edc","Type":"ContainerStarted","Data":"13be78f9c0c9757107aa9053d1474cb9c968742a9a38c45106915e7226ec5cca"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.530101 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" podStartSLOduration=179.530079497 podStartE2EDuration="2m59.530079497s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:50.528883906 +0000 UTC m=+248.993871237" watchObservedRunningTime="2026-03-07 07:03:50.530079497 +0000 UTC m=+248.995066818" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.531294 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" event={"ID":"ecc54fff-ab62-4cf3-ae4d-0c25951498c8","Type":"ContainerStarted","Data":"cca202cfef7d5099dddc3ef6e4381b3eb057b42e1b37e05d875a74fd357ebc63"} Mar 07 07:03:50 
crc kubenswrapper[4738]: I0307 07:03:50.536837 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lqf85" event={"ID":"599d27ce-56f1-452b-bb67-2ad0178c9d61","Type":"ContainerStarted","Data":"d67b04f6cddc7342ab657c5b6938bbbf5393784421cc0f89adb82d0b4a316791"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.538795 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv" event={"ID":"892836f4-a16e-4521-a239-ee48bfc834d2","Type":"ContainerStarted","Data":"c53cfefdabbcacff19b56c9b091503e563efdb888223ac64983dd8dc7d32b66f"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.539872 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg" event={"ID":"2d86bd56-4321-49e4-9a52-8591e1060490","Type":"ContainerStarted","Data":"160fee14f094e911548c2f621bf8ad4ee4940d301ef3c617a845f069c1247020"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.543897 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh" event={"ID":"9b53e2fb-773e-4c0e-a251-c0fdc80b66f8","Type":"ContainerStarted","Data":"16b55932061cef105015affe7adeafa88a1de86da9e467613e6cc4844f3bfdfd"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.549376 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4grdb" podStartSLOduration=179.549355118 podStartE2EDuration="2m59.549355118s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:50.547396356 +0000 UTC m=+249.012383687" watchObservedRunningTime="2026-03-07 07:03:50.549355118 +0000 UTC m=+249.014342439" Mar 07 07:03:50 
crc kubenswrapper[4738]: I0307 07:03:50.553909 4738 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-p7l6z container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.553993 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z" podUID="ecc54fff-ab62-4cf3-ae4d-0c25951498c8" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.555322 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" event={"ID":"a89ff68b-6456-4756-b991-5ab40a3f3dfe","Type":"ContainerStarted","Data":"a48267293643a9feb911350305b6e08d7326f47322070a9cfd19cd91af5e3713"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.563929 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6" event={"ID":"2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c","Type":"ContainerStarted","Data":"29312a4fba322cff17d6aeaf3eb2305a8c13e654fa880e5d917d13344daa6352"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.577869 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-gptxv" podStartSLOduration=178.577850163 podStartE2EDuration="2m58.577850163s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:50.564353105 +0000 UTC m=+249.029340426" watchObservedRunningTime="2026-03-07 
07:03:50.577850163 +0000 UTC m=+249.042837484" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.581714 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qx5sp" event={"ID":"a0826ee3-cdd2-4c67-9b92-4a33512e6f13","Type":"ContainerStarted","Data":"f1372dbd623bc91f30ae7114295fb78b485f7f05867f3909551c437ee59b2883"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.583671 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wj9wt" event={"ID":"a268242a-d0f5-4e03-9d87-cb5a26701223","Type":"ContainerStarted","Data":"101fb52f27574dd1c0a73b504a02453706df3f7794472f2a174103cef41b01ce"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.587484 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" event={"ID":"ad2266fc-521b-461f-ad3e-ef33b89c23d4","Type":"ContainerStarted","Data":"1aeb12c8ea6b9ecdc9a1ed1929f263c7e9e9dcb8e96d2cef98b224a6abbe8691"} Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.588379 4738 patch_prober.go:28] interesting pod/downloads-7954f5f757-wwk6z container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.588426 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wwk6z" podUID="3d9a181b-98f6-4420-90fb-6df876c703a0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.594663 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:50 crc kubenswrapper[4738]: E0307 07:03:50.594753 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:51.09473886 +0000 UTC m=+249.559726181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.594641 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh" podStartSLOduration=179.594625787 podStartE2EDuration="2m59.594625787s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:50.591192406 +0000 UTC m=+249.056179717" watchObservedRunningTime="2026-03-07 07:03:50.594625787 +0000 UTC m=+249.059613108" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.594894 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: 
\"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:50 crc kubenswrapper[4738]: E0307 07:03:50.596544 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:51.096535778 +0000 UTC m=+249.561523109 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.629051 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-w2cw7" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.647372 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" podStartSLOduration=178.647352614 podStartE2EDuration="2m58.647352614s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:50.613752154 +0000 UTC m=+249.078739475" watchObservedRunningTime="2026-03-07 07:03:50.647352614 +0000 UTC m=+249.112339935" Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.696723 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:50 crc kubenswrapper[4738]: E0307 07:03:50.697437 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:51.19741041 +0000 UTC m=+249.662397731 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.697919 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:50 crc kubenswrapper[4738]: E0307 07:03:50.712077 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:51.212056058 +0000 UTC m=+249.677043379 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.813546 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:50 crc kubenswrapper[4738]: E0307 07:03:50.814455 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:51.314430441 +0000 UTC m=+249.779417762 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:50 crc kubenswrapper[4738]: I0307 07:03:50.916238 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:50 crc kubenswrapper[4738]: E0307 07:03:50.916990 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:51.416976077 +0000 UTC m=+249.881963398 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.017404 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:51 crc kubenswrapper[4738]: E0307 07:03:51.017533 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:51.51751401 +0000 UTC m=+249.982501331 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.018103 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:51 crc kubenswrapper[4738]: E0307 07:03:51.018592 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:51.518569968 +0000 UTC m=+249.983557289 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.119396 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:51 crc kubenswrapper[4738]: E0307 07:03:51.119872 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:51.619853802 +0000 UTC m=+250.084841123 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.221322 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:51 crc kubenswrapper[4738]: E0307 07:03:51.221733 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:51.72171539 +0000 UTC m=+250.186702711 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.324578 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:51 crc kubenswrapper[4738]: E0307 07:03:51.325638 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:51.825614503 +0000 UTC m=+250.290601824 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.426846 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:51 crc kubenswrapper[4738]: E0307 07:03:51.427522 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:51.927498861 +0000 UTC m=+250.392486182 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.463142 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 07:03:51 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld Mar 07 07:03:51 crc kubenswrapper[4738]: [+]process-running ok Mar 07 07:03:51 crc kubenswrapper[4738]: healthz check failed Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.463229 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.528179 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:51 crc kubenswrapper[4738]: E0307 07:03:51.528555 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 07:03:52.028538148 +0000 UTC m=+250.493525469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.599973 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp" event={"ID":"c5559595-a2f4-43ac-903f-9b9e418bf435","Type":"ContainerStarted","Data":"0b0abdfc22add14bcc2db029feacc4b0e06c19bfb40d36b5613974739c331257"} Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.614965 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg" event={"ID":"2d86bd56-4321-49e4-9a52-8591e1060490","Type":"ContainerStarted","Data":"06eea259662ee15a8a2df4d83014f113ab0e19cabc9d05533c7b63da2d1facc4"} Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.615014 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg" event={"ID":"2d86bd56-4321-49e4-9a52-8591e1060490","Type":"ContainerStarted","Data":"4f4039ef64f33fd7479f5d374ce7ae23cfc6905338374c26dfb85d10492089ee"} Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.621912 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-wj9wt" event={"ID":"a268242a-d0f5-4e03-9d87-cb5a26701223","Type":"ContainerStarted","Data":"3ecdd6290b4ab3e2888cd1ccd7643a243d11ace414f82baeb64f67b10aa28cf5"} Mar 07 07:03:51 crc 
kubenswrapper[4738]: I0307 07:03:51.621975 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-hbllp" podStartSLOduration=179.621960054 podStartE2EDuration="2m59.621960054s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:51.621239065 +0000 UTC m=+250.086226386" watchObservedRunningTime="2026-03-07 07:03:51.621960054 +0000 UTC m=+250.086947375" Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.630315 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:51 crc kubenswrapper[4738]: E0307 07:03:51.630625 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:52.130613723 +0000 UTC m=+250.595601044 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.641510 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" event={"ID":"ba7ca967-58f7-4944-81d8-7bb8957707ad","Type":"ContainerStarted","Data":"d8eca9fcfe89268f7b6a6d400562444675b70a50bc8e0fdc86e2da780865a521"} Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.641898 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-qpkbn" event={"ID":"ba7ca967-58f7-4944-81d8-7bb8957707ad","Type":"ContainerStarted","Data":"62dfaf84f4d58bb9d560c891a1f20970517cf2e9818b6745bc2a66c7528b5464"} Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.649063 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-52lhg" podStartSLOduration=179.649049171 podStartE2EDuration="2m59.649049171s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:51.648484496 +0000 UTC m=+250.113471817" watchObservedRunningTime="2026-03-07 07:03:51.649049171 +0000 UTC m=+250.114036492" Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.655642 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278" 
event={"ID":"00b539ba-07e8-41d9-a8fb-b6b43d051edc","Type":"ContainerStarted","Data":"29c0cf34c564bfb0bdd8dab259100f2e315603376e234280f4455b751b78511e"} Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.656888 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278" Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.673244 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" event={"ID":"ef85004e-b70b-4eb0-9551-9436944618dc","Type":"ContainerStarted","Data":"9cb2b6e5ad8d521855b01cfe34d0b530386960c3bc230a8d5ddecb8c41fb1b75"} Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.684835 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-qpkbn" podStartSLOduration=180.684816349 podStartE2EDuration="3m0.684816349s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:51.681647435 +0000 UTC m=+250.146634756" watchObservedRunningTime="2026-03-07 07:03:51.684816349 +0000 UTC m=+250.149803670" Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.696545 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rwwsd" event={"ID":"5437e5e3-4ef3-4a8b-bcf4-e6ad524ebc5e","Type":"ContainerStarted","Data":"756ad79d3329815705a3719ee1c5d62010b6eb2a8f13b8f0c76231b97ba78ea9"} Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.706487 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qx5sp" event={"ID":"a0826ee3-cdd2-4c67-9b92-4a33512e6f13","Type":"ContainerStarted","Data":"23965d012a74ed375c663a956037f0b828269ac1adddb5fc984192448da6989a"} Mar 07 07:03:51 crc kubenswrapper[4738]: 
I0307 07:03:51.710057 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-wj9wt" podStartSLOduration=179.710041707 podStartE2EDuration="2m59.710041707s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:51.708667531 +0000 UTC m=+250.173654852" watchObservedRunningTime="2026-03-07 07:03:51.710041707 +0000 UTC m=+250.175029028" Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.729108 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" event={"ID":"2e1d9fee-954d-417a-8bba-28710f7a4bfa","Type":"ContainerStarted","Data":"3c03872d757456cde8f1faa42df9664f2a72a35f166bc6992757e34d687a7cd4"} Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.733739 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:51 crc kubenswrapper[4738]: E0307 07:03:51.735439 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:52.235417179 +0000 UTC m=+250.700404510 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.743107 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-xj55k" podStartSLOduration=180.743088423 podStartE2EDuration="3m0.743088423s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:51.739575 +0000 UTC m=+250.204562331" watchObservedRunningTime="2026-03-07 07:03:51.743088423 +0000 UTC m=+250.208075744" Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.748481 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lmc29" event={"ID":"4a8c4e0e-803d-4706-a411-669e05829f9c","Type":"ContainerStarted","Data":"0858d92fd3b163a09a01697a3ccf915b3916f7d405d48447dd5888d804014366"} Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.748561 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-lmc29" event={"ID":"4a8c4e0e-803d-4706-a411-669e05829f9c","Type":"ContainerStarted","Data":"b0ead578f7f21fd4824fdcd20acb1246a5a426bd01ff26cb7a90c1f62134577d"} Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.749241 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-lmc29" Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.757478 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6" event={"ID":"2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c","Type":"ContainerStarted","Data":"efc0055c49a91b7057af1b7cef38d4c63b9ac3422e031760e33810ebe3f3ff1f"}
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.758014 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6"
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.759001 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" event={"ID":"e1528346-60ec-4879-81b0-72027f1d1477","Type":"ContainerStarted","Data":"fac926deeddc1f21c400244b6cd3a55368d5330a4b28a7895fda3987b0ce8abb"}
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.764131 4738 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-pfjd6 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused" start-of-body=
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.764219 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6" podUID="2fd5c3c8-b569-4de9-af5f-8c59e3bb2c0c" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.28:8443/healthz\": dial tcp 10.217.0.28:8443: connect: connection refused"
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.773032 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp" event={"ID":"7c5a4f3a-15c2-4513-9f96-b24e96e51cc7","Type":"ContainerStarted","Data":"d2a630f779198ba2b5946eb6c4ec93be643a960195a4851a9b197554ee720fa1"}
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.773711 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rwwsd" podStartSLOduration=8.773693333 podStartE2EDuration="8.773693333s" podCreationTimestamp="2026-03-07 07:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:51.767963092 +0000 UTC m=+250.232950413" watchObservedRunningTime="2026-03-07 07:03:51.773693333 +0000 UTC m=+250.238680644"
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.774034 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp"
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.783516 4738 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-l7jfp container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body=
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.783577 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp" podUID="7c5a4f3a-15c2-4513-9f96-b24e96e51cc7" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused"
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.788034 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" event={"ID":"c1ebf6ab-7086-429d-9b03-fdc278235e3b","Type":"ContainerStarted","Data":"93cd4d890fba7cfcf5dab8b59705e12dfbcab8a49c63fb139ea5e4a46a225af0"}
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.788097 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" event={"ID":"c1ebf6ab-7086-429d-9b03-fdc278235e3b","Type":"ContainerStarted","Data":"3525317de5be7a45ba8a5d0d4c02ef4104d1d3be5a8c4abe332590f1cc26f71c"}
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.791312 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qx5sp" podStartSLOduration=179.791294939 podStartE2EDuration="2m59.791294939s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:51.790531009 +0000 UTC m=+250.255518330" watchObservedRunningTime="2026-03-07 07:03:51.791294939 +0000 UTC m=+250.256282250"
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.818175 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278" podStartSLOduration=179.818145681 podStartE2EDuration="2m59.818145681s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:51.816630851 +0000 UTC m=+250.281618172" watchObservedRunningTime="2026-03-07 07:03:51.818145681 +0000 UTC m=+250.283133002"
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.825847 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" event={"ID":"a89ff68b-6456-4756-b991-5ab40a3f3dfe","Type":"ContainerStarted","Data":"877476d0785a69e692f91e9092a7aebff1a7a097c77f8d619c203067367df17c"}
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.826265 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh"
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.827721 4738 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-7vtnd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused" start-of-body=
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.827772 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.24:6443/healthz\": dial tcp 10.217.0.24:6443: connect: connection refused"
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.828754 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm"
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.828808 4738 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-67thp container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.828834 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-67thp" podUID="6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused"
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.830515 4738 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-xd7zm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body=
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.830537 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" podUID="ad2266fc-521b-461f-ad3e-ef33b89c23d4" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": dial tcp 10.217.0.14:8443: connect: connection refused"
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.839925 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:03:51 crc kubenswrapper[4738]: E0307 07:03:51.842949 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:52.342933848 +0000 UTC m=+250.807921169 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.849012 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-c9btk" podStartSLOduration=179.848991718 podStartE2EDuration="2m59.848991718s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:51.84640992 +0000 UTC m=+250.311397241" watchObservedRunningTime="2026-03-07 07:03:51.848991718 +0000 UTC m=+250.313979039"
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.911641 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6" podStartSLOduration=179.911621647 podStartE2EDuration="2m59.911621647s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:51.910997241 +0000 UTC m=+250.375984562" watchObservedRunningTime="2026-03-07 07:03:51.911621647 +0000 UTC m=+250.376608968"
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.912510 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-lmc29" podStartSLOduration=7.91250508 podStartE2EDuration="7.91250508s" podCreationTimestamp="2026-03-07 07:03:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:51.885801203 +0000 UTC m=+250.350788524" watchObservedRunningTime="2026-03-07 07:03:51.91250508 +0000 UTC m=+250.377492401"
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.944715 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 07 07:03:51 crc kubenswrapper[4738]: E0307 07:03:51.947362 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:52.447329853 +0000 UTC m=+250.912317174 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:51 crc kubenswrapper[4738]: I0307 07:03:51.959552 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-5frsf" podStartSLOduration=179.959531406 podStartE2EDuration="2m59.959531406s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:51.956842095 +0000 UTC m=+250.421829416" watchObservedRunningTime="2026-03-07 07:03:51.959531406 +0000 UTC m=+250.424518727"
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.052025 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:03:52 crc kubenswrapper[4738]: E0307 07:03:52.052562 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:52.55254283 +0000 UTC m=+251.017530151 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.063029 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" podStartSLOduration=181.063012938 podStartE2EDuration="3m1.063012938s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:52.037506402 +0000 UTC m=+250.502493743" watchObservedRunningTime="2026-03-07 07:03:52.063012938 +0000 UTC m=+250.528000259"
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.071565 4738 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-k66nh container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body=
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.071651 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh" podUID="9b53e2fb-773e-4c0e-a251-c0fdc80b66f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused"
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.072021 4738 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-k66nh container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body=
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.072049 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh" podUID="9b53e2fb-773e-4c0e-a251-c0fdc80b66f8" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused"
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.150068 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp" podStartSLOduration=180.150048453 podStartE2EDuration="3m0.150048453s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:52.094541853 +0000 UTC m=+250.559529194" watchObservedRunningTime="2026-03-07 07:03:52.150048453 +0000 UTC m=+250.615035774"
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.152870 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 07 07:03:52 crc kubenswrapper[4738]: E0307 07:03:52.153258 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:52.653243248 +0000 UTC m=+251.118230569 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.215736 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-b7569" podStartSLOduration=180.215717374 podStartE2EDuration="3m0.215717374s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:52.214044199 +0000 UTC m=+250.679031530" watchObservedRunningTime="2026-03-07 07:03:52.215717374 +0000 UTC m=+250.680704695"
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.216468 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" podStartSLOduration=180.216463323 podStartE2EDuration="3m0.216463323s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:52.151280947 +0000 UTC m=+250.616268278" watchObservedRunningTime="2026-03-07 07:03:52.216463323 +0000 UTC m=+250.681450644"
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.254421 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:03:52 crc kubenswrapper[4738]: E0307 07:03:52.254782 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:52.754770818 +0000 UTC m=+251.219758139 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.355468 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 07 07:03:52 crc kubenswrapper[4738]: E0307 07:03:52.355663 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:52.855636231 +0000 UTC m=+251.320623552 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.355770 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:03:52 crc kubenswrapper[4738]: E0307 07:03:52.356125 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:52.856114113 +0000 UTC m=+251.321101424 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.413722 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" podStartSLOduration=181.413705348 podStartE2EDuration="3m1.413705348s" podCreationTimestamp="2026-03-07 07:00:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:52.251943934 +0000 UTC m=+250.716931265" watchObservedRunningTime="2026-03-07 07:03:52.413705348 +0000 UTC m=+250.878692669"
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.456687 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 07 07:03:52 crc kubenswrapper[4738]: E0307 07:03:52.457124 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:52.957109088 +0000 UTC m=+251.422096409 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.463639 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 07 07:03:52 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld
Mar 07 07:03:52 crc kubenswrapper[4738]: [+]process-running ok
Mar 07 07:03:52 crc kubenswrapper[4738]: healthz check failed
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.463720 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.558185 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:03:52 crc kubenswrapper[4738]: E0307 07:03:52.558601 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:53.058585397 +0000 UTC m=+251.523572718 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.659277 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 07 07:03:52 crc kubenswrapper[4738]: E0307 07:03:52.660250 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:53.160221119 +0000 UTC m=+251.625208440 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.761454 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:03:52 crc kubenswrapper[4738]: E0307 07:03:52.761881 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:53.261863952 +0000 UTC m=+251.726851273 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.823196 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-p7l6z"
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.854725 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lqf85" event={"ID":"599d27ce-56f1-452b-bb67-2ad0178c9d61","Type":"ContainerStarted","Data":"92af9d3eedee8eda322309a388cd0586588b374fa58ca34482683305ab932ca2"}
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.862554 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 07 07:03:52 crc kubenswrapper[4738]: E0307 07:03:52.862899 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:53.362883878 +0000 UTC m=+251.827871199 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.875215 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm"
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.877264 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-l7jfp"
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.881723 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-pfjd6"
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.957700 4738 ???:1] "http: TLS handshake error from 192.168.126.11:55512: no serving certificate available for the kubelet"
Mar 07 07:03:52 crc kubenswrapper[4738]: I0307 07:03:52.964873 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:03:52 crc kubenswrapper[4738]: E0307 07:03:52.969151 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:53.469126673 +0000 UTC m=+251.934113994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.067947 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 07 07:03:53 crc kubenswrapper[4738]: E0307 07:03:53.068203 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:53.568135276 +0000 UTC m=+252.033122597 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.068241 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:03:53 crc kubenswrapper[4738]: E0307 07:03:53.068632 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:53.568613828 +0000 UTC m=+252.033601149 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.077382 4738 ???:1] "http: TLS handshake error from 192.168.126.11:55528: no serving certificate available for the kubelet"
Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.139077 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.139857 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.143580 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.143643 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.162365 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.171395 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 07 07:03:53 crc kubenswrapper[4738]: E0307 07:03:53.171764 4738
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:53.67174699 +0000 UTC m=+252.136734311 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.180869 4738 ???:1] "http: TLS handshake error from 192.168.126.11:55538: no serving certificate available for the kubelet" Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.272650 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.272973 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7f0de6fb-1e9a-4678-821a-533eb8bb9766-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7f0de6fb-1e9a-4678-821a-533eb8bb9766\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.273057 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/7f0de6fb-1e9a-4678-821a-533eb8bb9766-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7f0de6fb-1e9a-4678-821a-533eb8bb9766\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 07:03:53 crc kubenswrapper[4738]: E0307 07:03:53.273389 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:53.773376642 +0000 UTC m=+252.238363963 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.285177 4738 ???:1] "http: TLS handshake error from 192.168.126.11:55540: no serving certificate available for the kubelet" Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.374168 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:53 crc kubenswrapper[4738]: E0307 07:03:53.374389 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 07:03:53.874346517 +0000 UTC m=+252.339333838 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.374604 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7f0de6fb-1e9a-4678-821a-533eb8bb9766-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7f0de6fb-1e9a-4678-821a-533eb8bb9766\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.374766 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.374798 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7f0de6fb-1e9a-4678-821a-533eb8bb9766-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7f0de6fb-1e9a-4678-821a-533eb8bb9766\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.374980 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/7f0de6fb-1e9a-4678-821a-533eb8bb9766-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"7f0de6fb-1e9a-4678-821a-533eb8bb9766\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 07:03:53 crc kubenswrapper[4738]: E0307 07:03:53.375316 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:53.875292243 +0000 UTC m=+252.340279744 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.398690 4738 ???:1] "http: TLS handshake error from 192.168.126.11:55556: no serving certificate available for the kubelet" Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.417382 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7f0de6fb-1e9a-4678-821a-533eb8bb9766-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"7f0de6fb-1e9a-4678-821a-533eb8bb9766\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.461719 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 07:03:53 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld Mar 07 07:03:53 
crc kubenswrapper[4738]: [+]process-running ok Mar 07 07:03:53 crc kubenswrapper[4738]: healthz check failed Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.461807 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.462406 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.475507 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:53 crc kubenswrapper[4738]: E0307 07:03:53.475954 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:53.975938199 +0000 UTC m=+252.440925520 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.562607 4738 ???:1] "http: TLS handshake error from 192.168.126.11:55560: no serving certificate available for the kubelet" Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.577099 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:53 crc kubenswrapper[4738]: E0307 07:03:53.577511 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:54.07749753 +0000 UTC m=+252.542484851 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.595238 4738 ???:1] "http: TLS handshake error from 192.168.126.11:55574: no serving certificate available for the kubelet" Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.682092 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:53 crc kubenswrapper[4738]: E0307 07:03:53.682515 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:54.182443719 +0000 UTC m=+252.647431040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.683036 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:53 crc kubenswrapper[4738]: E0307 07:03:53.683451 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:54.183438866 +0000 UTC m=+252.648426177 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.783871 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:53 crc kubenswrapper[4738]: E0307 07:03:53.784385 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:54.28436273 +0000 UTC m=+252.749350051 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.791592 4738 ???:1] "http: TLS handshake error from 192.168.126.11:55578: no serving certificate available for the kubelet" Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.850338 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-k66nh" Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.888094 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:53 crc kubenswrapper[4738]: E0307 07:03:53.888504 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:54.388489598 +0000 UTC m=+252.853476919 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.919864 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 07 07:03:53 crc kubenswrapper[4738]: I0307 07:03:53.990026 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:53 crc kubenswrapper[4738]: E0307 07:03:53.991531 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:54.491507268 +0000 UTC m=+252.956494589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.095078 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:54 crc kubenswrapper[4738]: E0307 07:03:54.095695 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:54.595671637 +0000 UTC m=+253.060658958 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.118728 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-t6bh9"] Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.119892 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t6bh9" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.125429 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.146759 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t6bh9"] Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.197830 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:54 crc kubenswrapper[4738]: E0307 07:03:54.198334 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:54.698312056 +0000 UTC m=+253.163299377 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.280691 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-587ww"] Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.281913 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-587ww" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.284170 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.299738 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-utilities\") pod \"certified-operators-t6bh9\" (UID: \"f34dc9c2-f90f-473e-8dca-a3df3f70e02f\") " pod="openshift-marketplace/certified-operators-t6bh9" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.299791 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbhg8\" (UniqueName: \"kubernetes.io/projected/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-kube-api-access-zbhg8\") pod \"certified-operators-t6bh9\" (UID: \"f34dc9c2-f90f-473e-8dca-a3df3f70e02f\") " pod="openshift-marketplace/certified-operators-t6bh9" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.299821 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.299858 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-catalog-content\") pod \"certified-operators-t6bh9\" (UID: \"f34dc9c2-f90f-473e-8dca-a3df3f70e02f\") " pod="openshift-marketplace/certified-operators-t6bh9" Mar 07 07:03:54 crc kubenswrapper[4738]: E0307 07:03:54.300275 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:54.800257657 +0000 UTC m=+253.265244978 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.316712 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-587ww"] Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.401967 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:54 crc kubenswrapper[4738]: E0307 07:03:54.402316 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:54.902268059 +0000 UTC m=+253.367255380 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.402377 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-catalog-content\") pod \"certified-operators-t6bh9\" (UID: \"f34dc9c2-f90f-473e-8dca-a3df3f70e02f\") " pod="openshift-marketplace/certified-operators-t6bh9" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.402443 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kwcq\" (UniqueName: \"kubernetes.io/projected/b82c0da7-caec-462d-85e9-f3c45cc042b5-kube-api-access-6kwcq\") pod \"community-operators-587ww\" (UID: \"b82c0da7-caec-462d-85e9-f3c45cc042b5\") " pod="openshift-marketplace/community-operators-587ww" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.402540 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b82c0da7-caec-462d-85e9-f3c45cc042b5-catalog-content\") pod \"community-operators-587ww\" (UID: \"b82c0da7-caec-462d-85e9-f3c45cc042b5\") " pod="openshift-marketplace/community-operators-587ww" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.402566 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b82c0da7-caec-462d-85e9-f3c45cc042b5-utilities\") pod \"community-operators-587ww\" (UID: 
\"b82c0da7-caec-462d-85e9-f3c45cc042b5\") " pod="openshift-marketplace/community-operators-587ww" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.404404 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-utilities\") pod \"certified-operators-t6bh9\" (UID: \"f34dc9c2-f90f-473e-8dca-a3df3f70e02f\") " pod="openshift-marketplace/certified-operators-t6bh9" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.404461 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbhg8\" (UniqueName: \"kubernetes.io/projected/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-kube-api-access-zbhg8\") pod \"certified-operators-t6bh9\" (UID: \"f34dc9c2-f90f-473e-8dca-a3df3f70e02f\") " pod="openshift-marketplace/certified-operators-t6bh9" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.404505 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:54 crc kubenswrapper[4738]: E0307 07:03:54.405133 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:54.905113495 +0000 UTC m=+253.370100816 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.405739 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-utilities\") pod \"certified-operators-t6bh9\" (UID: \"f34dc9c2-f90f-473e-8dca-a3df3f70e02f\") " pod="openshift-marketplace/certified-operators-t6bh9" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.417704 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-catalog-content\") pod \"certified-operators-t6bh9\" (UID: \"f34dc9c2-f90f-473e-8dca-a3df3f70e02f\") " pod="openshift-marketplace/certified-operators-t6bh9" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.450053 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbhg8\" (UniqueName: \"kubernetes.io/projected/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-kube-api-access-zbhg8\") pod \"certified-operators-t6bh9\" (UID: \"f34dc9c2-f90f-473e-8dca-a3df3f70e02f\") " pod="openshift-marketplace/certified-operators-t6bh9" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.456030 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t6bh9" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.472823 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-flk6x"] Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.477051 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-flk6x" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.477329 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 07:03:54 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld Mar 07 07:03:54 crc kubenswrapper[4738]: [+]process-running ok Mar 07 07:03:54 crc kubenswrapper[4738]: healthz check failed Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.477399 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.486132 4738 ???:1] "http: TLS handshake error from 192.168.126.11:55588: no serving certificate available for the kubelet" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.494205 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-flk6x"] Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.505926 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.506231 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kwcq\" (UniqueName: \"kubernetes.io/projected/b82c0da7-caec-462d-85e9-f3c45cc042b5-kube-api-access-6kwcq\") pod \"community-operators-587ww\" (UID: \"b82c0da7-caec-462d-85e9-f3c45cc042b5\") " pod="openshift-marketplace/community-operators-587ww" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.506270 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b82c0da7-caec-462d-85e9-f3c45cc042b5-catalog-content\") pod \"community-operators-587ww\" (UID: \"b82c0da7-caec-462d-85e9-f3c45cc042b5\") " pod="openshift-marketplace/community-operators-587ww" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.506289 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b82c0da7-caec-462d-85e9-f3c45cc042b5-utilities\") pod \"community-operators-587ww\" (UID: \"b82c0da7-caec-462d-85e9-f3c45cc042b5\") " pod="openshift-marketplace/community-operators-587ww" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.507218 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b82c0da7-caec-462d-85e9-f3c45cc042b5-utilities\") pod \"community-operators-587ww\" (UID: \"b82c0da7-caec-462d-85e9-f3c45cc042b5\") " pod="openshift-marketplace/community-operators-587ww" Mar 07 07:03:54 crc kubenswrapper[4738]: E0307 07:03:54.507304 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-07 07:03:55.007288442 +0000 UTC m=+253.472275763 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.507760 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b82c0da7-caec-462d-85e9-f3c45cc042b5-catalog-content\") pod \"community-operators-587ww\" (UID: \"b82c0da7-caec-462d-85e9-f3c45cc042b5\") " pod="openshift-marketplace/community-operators-587ww" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.559386 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kwcq\" (UniqueName: \"kubernetes.io/projected/b82c0da7-caec-462d-85e9-f3c45cc042b5-kube-api-access-6kwcq\") pod \"community-operators-587ww\" (UID: \"b82c0da7-caec-462d-85e9-f3c45cc042b5\") " pod="openshift-marketplace/community-operators-587ww" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.607458 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-utilities\") pod \"certified-operators-flk6x\" (UID: \"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0\") " pod="openshift-marketplace/certified-operators-flk6x" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.608518 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-catalog-content\") pod \"certified-operators-flk6x\" (UID: \"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0\") " pod="openshift-marketplace/certified-operators-flk6x" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.608685 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd8gn\" (UniqueName: \"kubernetes.io/projected/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-kube-api-access-zd8gn\") pod \"certified-operators-flk6x\" (UID: \"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0\") " pod="openshift-marketplace/certified-operators-flk6x" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.608773 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:54 crc kubenswrapper[4738]: E0307 07:03:54.609179 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:55.10914902 +0000 UTC m=+253.574136341 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.619664 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-587ww" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.652689 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xd7zm"] Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.708674 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k4rs8"] Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.709922 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k4rs8" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.710205 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.710432 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd8gn\" (UniqueName: \"kubernetes.io/projected/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-kube-api-access-zd8gn\") pod \"certified-operators-flk6x\" (UID: \"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0\") " pod="openshift-marketplace/certified-operators-flk6x" Mar 07 07:03:54 crc kubenswrapper[4738]: E0307 07:03:54.710502 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:55.210476894 +0000 UTC m=+253.675464415 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.710534 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-utilities\") pod \"certified-operators-flk6x\" (UID: \"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0\") " pod="openshift-marketplace/certified-operators-flk6x" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.710618 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-catalog-content\") pod \"certified-operators-flk6x\" (UID: \"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0\") " pod="openshift-marketplace/certified-operators-flk6x" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.711209 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-utilities\") pod \"certified-operators-flk6x\" (UID: \"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0\") " pod="openshift-marketplace/certified-operators-flk6x" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.711306 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-catalog-content\") pod \"certified-operators-flk6x\" (UID: \"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0\") " pod="openshift-marketplace/certified-operators-flk6x" 
Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.742402 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k4rs8"] Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.745413 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd8gn\" (UniqueName: \"kubernetes.io/projected/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-kube-api-access-zd8gn\") pod \"certified-operators-flk6x\" (UID: \"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0\") " pod="openshift-marketplace/certified-operators-flk6x" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.815999 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.816073 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef082c9b-8cc2-4c38-8957-28912470b473-utilities\") pod \"community-operators-k4rs8\" (UID: \"ef082c9b-8cc2-4c38-8957-28912470b473\") " pod="openshift-marketplace/community-operators-k4rs8" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.816102 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5l74\" (UniqueName: \"kubernetes.io/projected/ef082c9b-8cc2-4c38-8957-28912470b473-kube-api-access-d5l74\") pod \"community-operators-k4rs8\" (UID: \"ef082c9b-8cc2-4c38-8957-28912470b473\") " pod="openshift-marketplace/community-operators-k4rs8" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.816188 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef082c9b-8cc2-4c38-8957-28912470b473-catalog-content\") pod \"community-operators-k4rs8\" (UID: \"ef082c9b-8cc2-4c38-8957-28912470b473\") " pod="openshift-marketplace/community-operators-k4rs8" Mar 07 07:03:54 crc kubenswrapper[4738]: E0307 07:03:54.816544 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:55.316532334 +0000 UTC m=+253.781519655 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.819500 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx"] Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.822379 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" podUID="8b9ad8ed-a9c2-40ce-8664-c59a4911c88c" containerName="route-controller-manager" containerID="cri-o://ab526ceaeb29a98d49bb2d2bb5b2f4b8c67fa3bbe9a0d81aab6ab6aead176162" gracePeriod=30 Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.834238 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-flk6x" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.881072 4738 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-5j9bx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.881141 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" podUID="8b9ad8ed-a9c2-40ce-8664-c59a4911c88c" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.5:8443/healthz\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.916783 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.917284 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef082c9b-8cc2-4c38-8957-28912470b473-catalog-content\") pod \"community-operators-k4rs8\" (UID: \"ef082c9b-8cc2-4c38-8957-28912470b473\") " pod="openshift-marketplace/community-operators-k4rs8" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.917348 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef082c9b-8cc2-4c38-8957-28912470b473-utilities\") pod \"community-operators-k4rs8\" (UID: \"ef082c9b-8cc2-4c38-8957-28912470b473\") " 
pod="openshift-marketplace/community-operators-k4rs8" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.917375 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5l74\" (UniqueName: \"kubernetes.io/projected/ef082c9b-8cc2-4c38-8957-28912470b473-kube-api-access-d5l74\") pod \"community-operators-k4rs8\" (UID: \"ef082c9b-8cc2-4c38-8957-28912470b473\") " pod="openshift-marketplace/community-operators-k4rs8" Mar 07 07:03:54 crc kubenswrapper[4738]: E0307 07:03:54.917721 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:55.417707004 +0000 UTC m=+253.882694325 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.918090 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef082c9b-8cc2-4c38-8957-28912470b473-catalog-content\") pod \"community-operators-k4rs8\" (UID: \"ef082c9b-8cc2-4c38-8957-28912470b473\") " pod="openshift-marketplace/community-operators-k4rs8" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.918318 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef082c9b-8cc2-4c38-8957-28912470b473-utilities\") pod \"community-operators-k4rs8\" (UID: 
\"ef082c9b-8cc2-4c38-8957-28912470b473\") " pod="openshift-marketplace/community-operators-k4rs8" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.934351 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7f0de6fb-1e9a-4678-821a-533eb8bb9766","Type":"ContainerStarted","Data":"a22b33c7fab708b5d277164867f082d44a53ad1680579f64598a901e757f40e5"} Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.934429 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7f0de6fb-1e9a-4678-821a-533eb8bb9766","Type":"ContainerStarted","Data":"f12a9a3e7c164534650af38086e8faf79faa31c7cc258e831b8b22d33cca4e79"} Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.940443 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5l74\" (UniqueName: \"kubernetes.io/projected/ef082c9b-8cc2-4c38-8957-28912470b473-kube-api-access-d5l74\") pod \"community-operators-k4rs8\" (UID: \"ef082c9b-8cc2-4c38-8957-28912470b473\") " pod="openshift-marketplace/community-operators-k4rs8" Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.956496 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lqf85" event={"ID":"599d27ce-56f1-452b-bb67-2ad0178c9d61","Type":"ContainerStarted","Data":"95b39bf314d468aa70a1ed9092cf7595d915500a2ad8ee0117a2953773bd5ae2"} Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.956711 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" podUID="ad2266fc-521b-461f-ad3e-ef33b89c23d4" containerName="controller-manager" containerID="cri-o://1aeb12c8ea6b9ecdc9a1ed1929f263c7e9e9dcb8e96d2cef98b224a6abbe8691" gracePeriod=30 Mar 07 07:03:54 crc kubenswrapper[4738]: I0307 07:03:54.973711 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.973679637 podStartE2EDuration="1.973679637s" podCreationTimestamp="2026-03-07 07:03:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:54.971667235 +0000 UTC m=+253.436654556" watchObservedRunningTime="2026-03-07 07:03:54.973679637 +0000 UTC m=+253.438666948" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.022345 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:55 crc kubenswrapper[4738]: E0307 07:03:55.022821 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:55.522803839 +0000 UTC m=+253.987791160 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.037394 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k4rs8" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.125802 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:55 crc kubenswrapper[4738]: E0307 07:03:55.127173 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:55.627137143 +0000 UTC m=+254.092124464 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.233181 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:55 crc kubenswrapper[4738]: E0307 07:03:55.233726 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:55.733714116 +0000 UTC m=+254.198701437 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.243399 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-t6bh9"] Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.288895 4738 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.302857 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-587ww"] Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.333984 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:55 crc kubenswrapper[4738]: E0307 07:03:55.334220 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:55.834184238 +0000 UTC m=+254.299171559 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.334293 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:55 crc kubenswrapper[4738]: E0307 07:03:55.335031 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:55.83502305 +0000 UTC m=+254.300010371 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.435387 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:55 crc kubenswrapper[4738]: E0307 07:03:55.436111 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:55.936092038 +0000 UTC m=+254.401079359 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.461308 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 07:03:55 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld Mar 07 07:03:55 crc kubenswrapper[4738]: [+]process-running ok Mar 07 07:03:55 crc kubenswrapper[4738]: healthz check failed Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.461653 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.540702 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:55 crc kubenswrapper[4738]: E0307 07:03:55.541511 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-07 07:03:56.0414993 +0000 UTC m=+254.506486621 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.641955 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:55 crc kubenswrapper[4738]: E0307 07:03:55.642681 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:56.14265444 +0000 UTC m=+254.607641761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.726863 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.745516 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:55 crc kubenswrapper[4738]: E0307 07:03:55.745980 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-07 07:03:56.245962977 +0000 UTC m=+254.710950298 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-gsfcg" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.748593 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-flk6x"] Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.753108 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k4rs8"] Mar 07 07:03:55 crc kubenswrapper[4738]: W0307 07:03:55.772842 4738 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef082c9b_8cc2_4c38_8957_28912470b473.slice/crio-e1304227800a2b091ce04b4f208ae652cdb6f61393ec12750bc74bcdf7ebd7c9 WatchSource:0}: Error finding container e1304227800a2b091ce04b4f208ae652cdb6f61393ec12750bc74bcdf7ebd7c9: Status 404 returned error can't find the container with id e1304227800a2b091ce04b4f208ae652cdb6f61393ec12750bc74bcdf7ebd7c9 Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.818362 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.818886 4738 ???:1] "http: TLS handshake error from 192.168.126.11:55604: no serving certificate available for the kubelet" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.845259 4738 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-07T07:03:55.288930619Z","Handler":null,"Name":""} Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.847543 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.847627 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-config\") pod \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.847684 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-config\") pod \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.847726 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7lg2v\" (UniqueName: \"kubernetes.io/projected/ad2266fc-521b-461f-ad3e-ef33b89c23d4-kube-api-access-7lg2v\") pod \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.847763 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-client-ca\") pod \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.847800 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-proxy-ca-bundles\") pod \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.847834 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2266fc-521b-461f-ad3e-ef33b89c23d4-serving-cert\") pod \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.847859 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-serving-cert\") pod \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " Mar 07 07:03:55 crc kubenswrapper[4738]: 
I0307 07:03:55.847907 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdqgb\" (UniqueName: \"kubernetes.io/projected/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-kube-api-access-jdqgb\") pod \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\" (UID: \"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c\") " Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.847951 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-client-ca\") pod \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\" (UID: \"ad2266fc-521b-461f-ad3e-ef33b89c23d4\") " Mar 07 07:03:55 crc kubenswrapper[4738]: E0307 07:03:55.848563 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-07 07:03:56.348526875 +0000 UTC m=+254.813514206 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.849277 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-client-ca" (OuterVolumeSpecName: "client-ca") pod "8b9ad8ed-a9c2-40ce-8664-c59a4911c88c" (UID: "8b9ad8ed-a9c2-40ce-8664-c59a4911c88c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.849316 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-client-ca" (OuterVolumeSpecName: "client-ca") pod "ad2266fc-521b-461f-ad3e-ef33b89c23d4" (UID: "ad2266fc-521b-461f-ad3e-ef33b89c23d4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.849285 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ad2266fc-521b-461f-ad3e-ef33b89c23d4" (UID: "ad2266fc-521b-461f-ad3e-ef33b89c23d4"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.849877 4738 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.849927 4738 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.850037 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-config" (OuterVolumeSpecName: "config") pod "ad2266fc-521b-461f-ad3e-ef33b89c23d4" (UID: "ad2266fc-521b-461f-ad3e-ef33b89c23d4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.850058 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-config" (OuterVolumeSpecName: "config") pod "8b9ad8ed-a9c2-40ce-8664-c59a4911c88c" (UID: "8b9ad8ed-a9c2-40ce-8664-c59a4911c88c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.857868 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8b9ad8ed-a9c2-40ce-8664-c59a4911c88c" (UID: "8b9ad8ed-a9c2-40ce-8664-c59a4911c88c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.875861 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2266fc-521b-461f-ad3e-ef33b89c23d4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ad2266fc-521b-461f-ad3e-ef33b89c23d4" (UID: "ad2266fc-521b-461f-ad3e-ef33b89c23d4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.876051 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2266fc-521b-461f-ad3e-ef33b89c23d4-kube-api-access-7lg2v" (OuterVolumeSpecName: "kube-api-access-7lg2v") pod "ad2266fc-521b-461f-ad3e-ef33b89c23d4" (UID: "ad2266fc-521b-461f-ad3e-ef33b89c23d4"). InnerVolumeSpecName "kube-api-access-7lg2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.876150 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-kube-api-access-jdqgb" (OuterVolumeSpecName: "kube-api-access-jdqgb") pod "8b9ad8ed-a9c2-40ce-8664-c59a4911c88c" (UID: "8b9ad8ed-a9c2-40ce-8664-c59a4911c88c"). InnerVolumeSpecName "kube-api-access-jdqgb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.949011 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.949138 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.949182 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.949192 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.949200 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7lg2v\" (UniqueName: \"kubernetes.io/projected/ad2266fc-521b-461f-ad3e-ef33b89c23d4-kube-api-access-7lg2v\") on node \"crc\" 
DevicePath \"\"" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.949209 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.949217 4738 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ad2266fc-521b-461f-ad3e-ef33b89c23d4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.949226 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad2266fc-521b-461f-ad3e-ef33b89c23d4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.949237 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.949248 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdqgb\" (UniqueName: \"kubernetes.io/projected/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c-kube-api-access-jdqgb\") on node \"crc\" DevicePath \"\"" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.965484 4738 generic.go:334] "Generic (PLEG): container finished" podID="ad2266fc-521b-461f-ad3e-ef33b89c23d4" containerID="1aeb12c8ea6b9ecdc9a1ed1929f263c7e9e9dcb8e96d2cef98b224a6abbe8691" exitCode=0 Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.965569 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" event={"ID":"ad2266fc-521b-461f-ad3e-ef33b89c23d4","Type":"ContainerDied","Data":"1aeb12c8ea6b9ecdc9a1ed1929f263c7e9e9dcb8e96d2cef98b224a6abbe8691"} Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 
07:03:55.965601 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" event={"ID":"ad2266fc-521b-461f-ad3e-ef33b89c23d4","Type":"ContainerDied","Data":"b2c3baebcaecd93e265a99404b6f61ba74055991fc8e6d8ad53610ce2cecc5e2"} Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.965621 4738 scope.go:117] "RemoveContainer" containerID="1aeb12c8ea6b9ecdc9a1ed1929f263c7e9e9dcb8e96d2cef98b224a6abbe8691" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.965642 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-xd7zm" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.967903 4738 generic.go:334] "Generic (PLEG): container finished" podID="f34dc9c2-f90f-473e-8dca-a3df3f70e02f" containerID="338ecded0752c2690325db46f9d6ecc7a6a65f1919097f4cfe6ca4ab6b577f18" exitCode=0 Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.968020 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6bh9" event={"ID":"f34dc9c2-f90f-473e-8dca-a3df3f70e02f","Type":"ContainerDied","Data":"338ecded0752c2690325db46f9d6ecc7a6a65f1919097f4cfe6ca4ab6b577f18"} Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.968085 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6bh9" event={"ID":"f34dc9c2-f90f-473e-8dca-a3df3f70e02f","Type":"ContainerStarted","Data":"b50475eefaa9e7509eefbd849ea95eabe6da405b3716000557b5c4dac3dd82f9"} Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.969708 4738 generic.go:334] "Generic (PLEG): container finished" podID="8b9ad8ed-a9c2-40ce-8664-c59a4911c88c" containerID="ab526ceaeb29a98d49bb2d2bb5b2f4b8c67fa3bbe9a0d81aab6ab6aead176162" exitCode=0 Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.969776 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" event={"ID":"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c","Type":"ContainerDied","Data":"ab526ceaeb29a98d49bb2d2bb5b2f4b8c67fa3bbe9a0d81aab6ab6aead176162"} Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.969799 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" event={"ID":"8b9ad8ed-a9c2-40ce-8664-c59a4911c88c","Type":"ContainerDied","Data":"db35ddaedb3788339ed0a6bdfa4a8f715fd56601871c6cb2514ca0094971f58a"} Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.969802 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx" Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.975905 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lqf85" event={"ID":"599d27ce-56f1-452b-bb67-2ad0178c9d61","Type":"ContainerStarted","Data":"7d4c2f8dd48212d7b62b20458871359024cfbb5e43a65f626960e24ebe25b647"} Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.976729 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-flk6x" event={"ID":"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0","Type":"ContainerStarted","Data":"1884d075e7c670c1618a31b9945e8bdb3f3f5615f09fc06cca1866ff2fd12c77"} Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.979133 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4rs8" event={"ID":"ef082c9b-8cc2-4c38-8957-28912470b473","Type":"ContainerStarted","Data":"e1304227800a2b091ce04b4f208ae652cdb6f61393ec12750bc74bcdf7ebd7c9"} Mar 07 07:03:55 crc kubenswrapper[4738]: I0307 07:03:55.995971 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-587ww" 
event={"ID":"b82c0da7-caec-462d-85e9-f3c45cc042b5","Type":"ContainerStarted","Data":"d3f135f1d2830f76a1c904bfa0d22a9cc8ce59fd39a839728b3f5119bb533a11"} Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.000007 4738 scope.go:117] "RemoveContainer" containerID="1aeb12c8ea6b9ecdc9a1ed1929f263c7e9e9dcb8e96d2cef98b224a6abbe8691" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.000232 4738 generic.go:334] "Generic (PLEG): container finished" podID="7f0de6fb-1e9a-4678-821a-533eb8bb9766" containerID="a22b33c7fab708b5d277164867f082d44a53ad1680579f64598a901e757f40e5" exitCode=0 Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.000347 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7f0de6fb-1e9a-4678-821a-533eb8bb9766","Type":"ContainerDied","Data":"a22b33c7fab708b5d277164867f082d44a53ad1680579f64598a901e757f40e5"} Mar 07 07:03:56 crc kubenswrapper[4738]: E0307 07:03:56.000782 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aeb12c8ea6b9ecdc9a1ed1929f263c7e9e9dcb8e96d2cef98b224a6abbe8691\": container with ID starting with 1aeb12c8ea6b9ecdc9a1ed1929f263c7e9e9dcb8e96d2cef98b224a6abbe8691 not found: ID does not exist" containerID="1aeb12c8ea6b9ecdc9a1ed1929f263c7e9e9dcb8e96d2cef98b224a6abbe8691" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.000913 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aeb12c8ea6b9ecdc9a1ed1929f263c7e9e9dcb8e96d2cef98b224a6abbe8691"} err="failed to get container status \"1aeb12c8ea6b9ecdc9a1ed1929f263c7e9e9dcb8e96d2cef98b224a6abbe8691\": rpc error: code = NotFound desc = could not find container \"1aeb12c8ea6b9ecdc9a1ed1929f263c7e9e9dcb8e96d2cef98b224a6abbe8691\": container with ID starting with 1aeb12c8ea6b9ecdc9a1ed1929f263c7e9e9dcb8e96d2cef98b224a6abbe8691 not found: ID does not exist" Mar 07 07:03:56 crc 
kubenswrapper[4738]: I0307 07:03:56.001025 4738 scope.go:117] "RemoveContainer" containerID="ab526ceaeb29a98d49bb2d2bb5b2f4b8c67fa3bbe9a0d81aab6ab6aead176162" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.000809 4738 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.001286 4738 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.006131 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xd7zm"] Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.011714 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-xd7zm"] Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.013029 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.013073 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.022807 4738 patch_prober.go:28] interesting pod/apiserver-76f77b778f-rb6b7 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 
500" start-of-body=[+]ping ok Mar 07 07:03:56 crc kubenswrapper[4738]: [+]log ok Mar 07 07:03:56 crc kubenswrapper[4738]: [+]etcd ok Mar 07 07:03:56 crc kubenswrapper[4738]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 07 07:03:56 crc kubenswrapper[4738]: [+]poststarthook/generic-apiserver-start-informers ok Mar 07 07:03:56 crc kubenswrapper[4738]: [+]poststarthook/max-in-flight-filter ok Mar 07 07:03:56 crc kubenswrapper[4738]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 07 07:03:56 crc kubenswrapper[4738]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 07 07:03:56 crc kubenswrapper[4738]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 07 07:03:56 crc kubenswrapper[4738]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 07 07:03:56 crc kubenswrapper[4738]: [+]poststarthook/project.openshift.io-projectcache ok Mar 07 07:03:56 crc kubenswrapper[4738]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 07 07:03:56 crc kubenswrapper[4738]: [+]poststarthook/openshift.io-startinformers ok Mar 07 07:03:56 crc kubenswrapper[4738]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 07 07:03:56 crc kubenswrapper[4738]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 07 07:03:56 crc kubenswrapper[4738]: livez check failed Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.022866 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-rb6b7" podUID="a89ff68b-6456-4756-b991-5ab40a3f3dfe" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.041443 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx"] Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.041916 
4738 scope.go:117] "RemoveContainer" containerID="ab526ceaeb29a98d49bb2d2bb5b2f4b8c67fa3bbe9a0d81aab6ab6aead176162" Mar 07 07:03:56 crc kubenswrapper[4738]: E0307 07:03:56.042437 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab526ceaeb29a98d49bb2d2bb5b2f4b8c67fa3bbe9a0d81aab6ab6aead176162\": container with ID starting with ab526ceaeb29a98d49bb2d2bb5b2f4b8c67fa3bbe9a0d81aab6ab6aead176162 not found: ID does not exist" containerID="ab526ceaeb29a98d49bb2d2bb5b2f4b8c67fa3bbe9a0d81aab6ab6aead176162" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.042467 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab526ceaeb29a98d49bb2d2bb5b2f4b8c67fa3bbe9a0d81aab6ab6aead176162"} err="failed to get container status \"ab526ceaeb29a98d49bb2d2bb5b2f4b8c67fa3bbe9a0d81aab6ab6aead176162\": rpc error: code = NotFound desc = could not find container \"ab526ceaeb29a98d49bb2d2bb5b2f4b8c67fa3bbe9a0d81aab6ab6aead176162\": container with ID starting with ab526ceaeb29a98d49bb2d2bb5b2f4b8c67fa3bbe9a0d81aab6ab6aead176162 not found: ID does not exist" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.045436 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-5j9bx"] Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.051238 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-gsfcg\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") " pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.074608 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.150785 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.154991 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.187132 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.187206 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.199344 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.255461 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rnt9k"] Mar 07 07:03:56 crc kubenswrapper[4738]: E0307 07:03:56.255678 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad2266fc-521b-461f-ad3e-ef33b89c23d4" containerName="controller-manager" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 
07:03:56.255691 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2266fc-521b-461f-ad3e-ef33b89c23d4" containerName="controller-manager" Mar 07 07:03:56 crc kubenswrapper[4738]: E0307 07:03:56.255709 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9ad8ed-a9c2-40ce-8664-c59a4911c88c" containerName="route-controller-manager" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.255715 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9ad8ed-a9c2-40ce-8664-c59a4911c88c" containerName="route-controller-manager" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.255808 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9ad8ed-a9c2-40ce-8664-c59a4911c88c" containerName="route-controller-manager" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.255825 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad2266fc-521b-461f-ad3e-ef33b89c23d4" containerName="controller-manager" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.258280 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnt9k" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.262614 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.267031 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnt9k"] Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.335078 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gsfcg"] Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.397635 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b9ad8ed-a9c2-40ce-8664-c59a4911c88c" path="/var/lib/kubelet/pods/8b9ad8ed-a9c2-40ce-8664-c59a4911c88c/volumes" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.399171 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.401330 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2266fc-521b-461f-ad3e-ef33b89c23d4" path="/var/lib/kubelet/pods/ad2266fc-521b-461f-ad3e-ef33b89c23d4/volumes" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.415376 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.416485 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-rvllr" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.418257 4738 patch_prober.go:28] interesting pod/console-f9d7485db-rvllr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial 
tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.418315 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rvllr" podUID="e863c889-47c2-459d-a84f-dc360fe3098f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.454528 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ccab57d-f355-494a-adae-5a1dba9c360a-catalog-content\") pod \"redhat-marketplace-rnt9k\" (UID: \"4ccab57d-f355-494a-adae-5a1dba9c360a\") " pod="openshift-marketplace/redhat-marketplace-rnt9k" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.454602 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2w9b\" (UniqueName: \"kubernetes.io/projected/4ccab57d-f355-494a-adae-5a1dba9c360a-kube-api-access-x2w9b\") pod \"redhat-marketplace-rnt9k\" (UID: \"4ccab57d-f355-494a-adae-5a1dba9c360a\") " pod="openshift-marketplace/redhat-marketplace-rnt9k" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.454733 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ccab57d-f355-494a-adae-5a1dba9c360a-utilities\") pod \"redhat-marketplace-rnt9k\" (UID: \"4ccab57d-f355-494a-adae-5a1dba9c360a\") " pod="openshift-marketplace/redhat-marketplace-rnt9k" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.457990 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-k9g6h" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.463120 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 07:03:56 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld Mar 07 07:03:56 crc kubenswrapper[4738]: [+]process-running ok Mar 07 07:03:56 crc kubenswrapper[4738]: healthz check failed Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.463208 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.553607 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.555669 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2w9b\" (UniqueName: \"kubernetes.io/projected/4ccab57d-f355-494a-adae-5a1dba9c360a-kube-api-access-x2w9b\") pod \"redhat-marketplace-rnt9k\" (UID: \"4ccab57d-f355-494a-adae-5a1dba9c360a\") " pod="openshift-marketplace/redhat-marketplace-rnt9k" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.555869 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ccab57d-f355-494a-adae-5a1dba9c360a-utilities\") pod \"redhat-marketplace-rnt9k\" (UID: \"4ccab57d-f355-494a-adae-5a1dba9c360a\") " pod="openshift-marketplace/redhat-marketplace-rnt9k" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.556137 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ccab57d-f355-494a-adae-5a1dba9c360a-catalog-content\") pod \"redhat-marketplace-rnt9k\" (UID: 
\"4ccab57d-f355-494a-adae-5a1dba9c360a\") " pod="openshift-marketplace/redhat-marketplace-rnt9k" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.556623 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ccab57d-f355-494a-adae-5a1dba9c360a-utilities\") pod \"redhat-marketplace-rnt9k\" (UID: \"4ccab57d-f355-494a-adae-5a1dba9c360a\") " pod="openshift-marketplace/redhat-marketplace-rnt9k" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.556866 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ccab57d-f355-494a-adae-5a1dba9c360a-catalog-content\") pod \"redhat-marketplace-rnt9k\" (UID: \"4ccab57d-f355-494a-adae-5a1dba9c360a\") " pod="openshift-marketplace/redhat-marketplace-rnt9k" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.557576 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.560647 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.562240 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.566223 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.579556 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2w9b\" (UniqueName: \"kubernetes.io/projected/4ccab57d-f355-494a-adae-5a1dba9c360a-kube-api-access-x2w9b\") pod \"redhat-marketplace-rnt9k\" (UID: \"4ccab57d-f355-494a-adae-5a1dba9c360a\") " 
pod="openshift-marketplace/redhat-marketplace-rnt9k" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.582334 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnt9k" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.638641 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-58445c889b-vqh7f"] Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.639860 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.642493 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk"] Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.643469 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.644298 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.644493 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.646048 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.646172 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.646307 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" 
Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.646602 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.646797 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.650537 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.658261 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk"] Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.658469 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58445c889b-vqh7f"] Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.662915 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.662984 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.663293 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.663440 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.663488 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.666784 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-serving-cert\") pod \"route-controller-manager-6c98444477-mnjxk\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.666830 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhnsz\" (UniqueName: \"kubernetes.io/projected/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-kube-api-access-qhnsz\") pod \"controller-manager-58445c889b-vqh7f\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") " pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.666863 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-proxy-ca-bundles\") pod \"controller-manager-58445c889b-vqh7f\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") " pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.666885 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-config\") pod \"controller-manager-58445c889b-vqh7f\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") " pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.666926 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fab6aabb-c230-4043-94b0-b3ab1e6a8891-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"fab6aabb-c230-4043-94b0-b3ab1e6a8891\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.666943 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fab6aabb-c230-4043-94b0-b3ab1e6a8891-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fab6aabb-c230-4043-94b0-b3ab1e6a8891\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.666967 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-client-ca\") pod \"route-controller-manager-6c98444477-mnjxk\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.667026 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-serving-cert\") pod \"controller-manager-58445c889b-vqh7f\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") " pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.667296 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-config\") pod \"route-controller-manager-6c98444477-mnjxk\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.667330 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-client-ca\") pod \"controller-manager-58445c889b-vqh7f\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") " pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.667370 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms4z7\" (UniqueName: \"kubernetes.io/projected/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-kube-api-access-ms4z7\") pod \"route-controller-manager-6c98444477-mnjxk\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.674829 4738 patch_prober.go:28] interesting pod/downloads-7954f5f757-wwk6z container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.674883 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-wwk6z" podUID="3d9a181b-98f6-4420-90fb-6df876c703a0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.674926 4738 patch_prober.go:28] interesting pod/downloads-7954f5f757-wwk6z container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" start-of-body= Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.675009 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-wwk6z" 
podUID="3d9a181b-98f6-4420-90fb-6df876c703a0" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.15:8080/\": dial tcp 10.217.0.15:8080: connect: connection refused" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.680591 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-67hw7"] Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.683460 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67hw7" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.687863 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67hw7"] Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.769527 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-config\") pod \"controller-manager-58445c889b-vqh7f\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") " pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.770150 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fab6aabb-c230-4043-94b0-b3ab1e6a8891-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fab6aabb-c230-4043-94b0-b3ab1e6a8891\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.770197 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fab6aabb-c230-4043-94b0-b3ab1e6a8891-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fab6aabb-c230-4043-94b0-b3ab1e6a8891\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.770217 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-client-ca\") pod \"route-controller-manager-6c98444477-mnjxk\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.770263 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11b75931-7617-4db8-b9e0-62a32ccd6948-utilities\") pod \"redhat-marketplace-67hw7\" (UID: \"11b75931-7617-4db8-b9e0-62a32ccd6948\") " pod="openshift-marketplace/redhat-marketplace-67hw7" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.770286 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-serving-cert\") pod \"controller-manager-58445c889b-vqh7f\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") " pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.770305 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvzg2\" (UniqueName: \"kubernetes.io/projected/11b75931-7617-4db8-b9e0-62a32ccd6948-kube-api-access-lvzg2\") pod \"redhat-marketplace-67hw7\" (UID: \"11b75931-7617-4db8-b9e0-62a32ccd6948\") " pod="openshift-marketplace/redhat-marketplace-67hw7" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.770341 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-config\") pod \"route-controller-manager-6c98444477-mnjxk\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " 
pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.770373 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-client-ca\") pod \"controller-manager-58445c889b-vqh7f\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") " pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.770411 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms4z7\" (UniqueName: \"kubernetes.io/projected/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-kube-api-access-ms4z7\") pod \"route-controller-manager-6c98444477-mnjxk\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.770443 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-serving-cert\") pod \"route-controller-manager-6c98444477-mnjxk\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.770468 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11b75931-7617-4db8-b9e0-62a32ccd6948-catalog-content\") pod \"redhat-marketplace-67hw7\" (UID: \"11b75931-7617-4db8-b9e0-62a32ccd6948\") " pod="openshift-marketplace/redhat-marketplace-67hw7" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.770494 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhnsz\" (UniqueName: 
\"kubernetes.io/projected/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-kube-api-access-qhnsz\") pod \"controller-manager-58445c889b-vqh7f\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") " pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.770521 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-proxy-ca-bundles\") pod \"controller-manager-58445c889b-vqh7f\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") " pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.770858 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fab6aabb-c230-4043-94b0-b3ab1e6a8891-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"fab6aabb-c230-4043-94b0-b3ab1e6a8891\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.771607 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-client-ca\") pod \"route-controller-manager-6c98444477-mnjxk\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.772043 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-config\") pod \"route-controller-manager-6c98444477-mnjxk\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.772354 4738 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-config\") pod \"controller-manager-58445c889b-vqh7f\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") " pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.773010 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-proxy-ca-bundles\") pod \"controller-manager-58445c889b-vqh7f\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") " pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.774216 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-client-ca\") pod \"controller-manager-58445c889b-vqh7f\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") " pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.787835 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms4z7\" (UniqueName: \"kubernetes.io/projected/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-kube-api-access-ms4z7\") pod \"route-controller-manager-6c98444477-mnjxk\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.787942 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-serving-cert\") pod \"route-controller-manager-6c98444477-mnjxk\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 
07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.796970 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-serving-cert\") pod \"controller-manager-58445c889b-vqh7f\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") " pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.799830 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fab6aabb-c230-4043-94b0-b3ab1e6a8891-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"fab6aabb-c230-4043-94b0-b3ab1e6a8891\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.800316 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhnsz\" (UniqueName: \"kubernetes.io/projected/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-kube-api-access-qhnsz\") pod \"controller-manager-58445c889b-vqh7f\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") " pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.871308 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvzg2\" (UniqueName: \"kubernetes.io/projected/11b75931-7617-4db8-b9e0-62a32ccd6948-kube-api-access-lvzg2\") pod \"redhat-marketplace-67hw7\" (UID: \"11b75931-7617-4db8-b9e0-62a32ccd6948\") " pod="openshift-marketplace/redhat-marketplace-67hw7" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.871420 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11b75931-7617-4db8-b9e0-62a32ccd6948-catalog-content\") pod \"redhat-marketplace-67hw7\" (UID: \"11b75931-7617-4db8-b9e0-62a32ccd6948\") " 
pod="openshift-marketplace/redhat-marketplace-67hw7" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.871490 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11b75931-7617-4db8-b9e0-62a32ccd6948-utilities\") pod \"redhat-marketplace-67hw7\" (UID: \"11b75931-7617-4db8-b9e0-62a32ccd6948\") " pod="openshift-marketplace/redhat-marketplace-67hw7" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.871881 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11b75931-7617-4db8-b9e0-62a32ccd6948-utilities\") pod \"redhat-marketplace-67hw7\" (UID: \"11b75931-7617-4db8-b9e0-62a32ccd6948\") " pod="openshift-marketplace/redhat-marketplace-67hw7" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.872018 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11b75931-7617-4db8-b9e0-62a32ccd6948-catalog-content\") pod \"redhat-marketplace-67hw7\" (UID: \"11b75931-7617-4db8-b9e0-62a32ccd6948\") " pod="openshift-marketplace/redhat-marketplace-67hw7" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.884541 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnt9k"] Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.887601 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvzg2\" (UniqueName: \"kubernetes.io/projected/11b75931-7617-4db8-b9e0-62a32ccd6948-kube-api-access-lvzg2\") pod \"redhat-marketplace-67hw7\" (UID: \"11b75931-7617-4db8-b9e0-62a32ccd6948\") " pod="openshift-marketplace/redhat-marketplace-67hw7" Mar 07 07:03:56 crc kubenswrapper[4738]: W0307 07:03:56.898457 4738 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ccab57d_f355_494a_adae_5a1dba9c360a.slice/crio-f9479bfa25053e3452ceb83bed98544fe94e3e84ca6e014f3db218720e3df08b WatchSource:0}: Error finding container f9479bfa25053e3452ceb83bed98544fe94e3e84ca6e014f3db218720e3df08b: Status 404 returned error can't find the container with id f9479bfa25053e3452ceb83bed98544fe94e3e84ca6e014f3db218720e3df08b Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.910543 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.931764 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.960503 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.960573 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:03:56 crc kubenswrapper[4738]: I0307 07:03:56.987886 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.005750 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.020975 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67hw7" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.053988 4738 generic.go:334] "Generic (PLEG): container finished" podID="e1528346-60ec-4879-81b0-72027f1d1477" containerID="fac926deeddc1f21c400244b6cd3a55368d5330a4b28a7895fda3987b0ce8abb" exitCode=0 Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.054097 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" event={"ID":"e1528346-60ec-4879-81b0-72027f1d1477","Type":"ContainerDied","Data":"fac926deeddc1f21c400244b6cd3a55368d5330a4b28a7895fda3987b0ce8abb"} Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.081364 4738 generic.go:334] "Generic (PLEG): container finished" podID="ef082c9b-8cc2-4c38-8957-28912470b473" containerID="f45c01f78e4dab5d8ce7553c0629380d754b39fced51108596354afe67d0ac11" exitCode=0 Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.081942 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4rs8" event={"ID":"ef082c9b-8cc2-4c38-8957-28912470b473","Type":"ContainerDied","Data":"f45c01f78e4dab5d8ce7553c0629380d754b39fced51108596354afe67d0ac11"} Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.090551 4738 generic.go:334] "Generic (PLEG): container finished" podID="b82c0da7-caec-462d-85e9-f3c45cc042b5" containerID="971cdad0a3131bc196b1db49e70b891178c9c01db44d9bb41e01e7435211703e" exitCode=0 Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.090761 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-587ww" 
event={"ID":"b82c0da7-caec-462d-85e9-f3c45cc042b5","Type":"ContainerDied","Data":"971cdad0a3131bc196b1db49e70b891178c9c01db44d9bb41e01e7435211703e"} Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.115782 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnt9k" event={"ID":"4ccab57d-f355-494a-adae-5a1dba9c360a","Type":"ContainerStarted","Data":"f9479bfa25053e3452ceb83bed98544fe94e3e84ca6e014f3db218720e3df08b"} Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.170771 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-67thp" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.175994 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" event={"ID":"4e5277c1-ff6d-49c3-9443-19d8f98cae68","Type":"ContainerStarted","Data":"1e3b1455af9bf2a71eccd710da195555868a28e65e06914bb847bf544bb5688e"} Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.176055 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" event={"ID":"4e5277c1-ff6d-49c3-9443-19d8f98cae68","Type":"ContainerStarted","Data":"58abc2d40c524324709678519ec00b0553712c5bbc077046b857459dd9cafe07"} Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.177036 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.217991 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" podStartSLOduration=185.217973583 podStartE2EDuration="3m5.217973583s" podCreationTimestamp="2026-03-07 07:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 
07:03:57.210528357 +0000 UTC m=+255.675515678" watchObservedRunningTime="2026-03-07 07:03:57.217973583 +0000 UTC m=+255.682960904" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.221362 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lqf85" event={"ID":"599d27ce-56f1-452b-bb67-2ad0178c9d61","Type":"ContainerStarted","Data":"5694af2a3b44c9a7fef0d189d29fb681cdf2e99b289832dde95ecbfa6dd7f470"} Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.237238 4738 generic.go:334] "Generic (PLEG): container finished" podID="b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" containerID="b2aa0d20f9cafbe9163f5565774777fc8a178df6fc43d5408164bc81ec0eeb73" exitCode=0 Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.239245 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-flk6x" event={"ID":"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0","Type":"ContainerDied","Data":"b2aa0d20f9cafbe9163f5565774777fc8a178df6fc43d5408164bc81ec0eeb73"} Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.259861 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.264340 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-ksjbb" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.270246 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gfmjp"] Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.270505 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-lqf85" podStartSLOduration=14.270478544 podStartE2EDuration="14.270478544s" podCreationTimestamp="2026-03-07 07:03:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-07 07:03:57.262727359 +0000 UTC m=+255.727714680" watchObservedRunningTime="2026-03-07 07:03:57.270478544 +0000 UTC m=+255.735465865" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.271580 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.283543 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.304276 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gfmjp"] Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.388041 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2731d6f8-adb2-4068-a3c9-162cfcb7de07-catalog-content\") pod \"redhat-operators-gfmjp\" (UID: \"2731d6f8-adb2-4068-a3c9-162cfcb7de07\") " pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.388150 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8ph6\" (UniqueName: \"kubernetes.io/projected/2731d6f8-adb2-4068-a3c9-162cfcb7de07-kube-api-access-m8ph6\") pod \"redhat-operators-gfmjp\" (UID: \"2731d6f8-adb2-4068-a3c9-162cfcb7de07\") " pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.388259 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2731d6f8-adb2-4068-a3c9-162cfcb7de07-utilities\") pod \"redhat-operators-gfmjp\" (UID: \"2731d6f8-adb2-4068-a3c9-162cfcb7de07\") " pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 
07:03:57.409745 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.463449 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 07:03:57 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld Mar 07 07:03:57 crc kubenswrapper[4738]: [+]process-running ok Mar 07 07:03:57 crc kubenswrapper[4738]: healthz check failed Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.463512 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.492000 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2731d6f8-adb2-4068-a3c9-162cfcb7de07-catalog-content\") pod \"redhat-operators-gfmjp\" (UID: \"2731d6f8-adb2-4068-a3c9-162cfcb7de07\") " pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.492063 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8ph6\" (UniqueName: \"kubernetes.io/projected/2731d6f8-adb2-4068-a3c9-162cfcb7de07-kube-api-access-m8ph6\") pod \"redhat-operators-gfmjp\" (UID: \"2731d6f8-adb2-4068-a3c9-162cfcb7de07\") " pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.492097 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2731d6f8-adb2-4068-a3c9-162cfcb7de07-utilities\") pod \"redhat-operators-gfmjp\" (UID: \"2731d6f8-adb2-4068-a3c9-162cfcb7de07\") " pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.492710 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2731d6f8-adb2-4068-a3c9-162cfcb7de07-utilities\") pod \"redhat-operators-gfmjp\" (UID: \"2731d6f8-adb2-4068-a3c9-162cfcb7de07\") " pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.497494 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2731d6f8-adb2-4068-a3c9-162cfcb7de07-catalog-content\") pod \"redhat-operators-gfmjp\" (UID: \"2731d6f8-adb2-4068-a3c9-162cfcb7de07\") " pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.563593 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8ph6\" (UniqueName: \"kubernetes.io/projected/2731d6f8-adb2-4068-a3c9-162cfcb7de07-kube-api-access-m8ph6\") pod \"redhat-operators-gfmjp\" (UID: \"2731d6f8-adb2-4068-a3c9-162cfcb7de07\") " pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.624137 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.636339 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58445c889b-vqh7f"] Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.679137 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk"] Mar 07 07:03:57 crc kubenswrapper[4738]: W0307 07:03:57.680287 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28c5fa97_13ce_4c5c_83b5_fdd02611de9c.slice/crio-0136e5e3e41afe9f53d74f1f3f8e996888f378737ac31263511ca611cacf018d WatchSource:0}: Error finding container 0136e5e3e41afe9f53d74f1f3f8e996888f378737ac31263511ca611cacf018d: Status 404 returned error can't find the container with id 0136e5e3e41afe9f53d74f1f3f8e996888f378737ac31263511ca611cacf018d Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.680895 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mmvgq"] Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.682269 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mmvgq" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.691968 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mmvgq"] Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.799274 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e052e6e-6b6d-47c5-a745-a1db1cab9669-utilities\") pod \"redhat-operators-mmvgq\" (UID: \"7e052e6e-6b6d-47c5-a745-a1db1cab9669\") " pod="openshift-marketplace/redhat-operators-mmvgq" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.799662 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e052e6e-6b6d-47c5-a745-a1db1cab9669-catalog-content\") pod \"redhat-operators-mmvgq\" (UID: \"7e052e6e-6b6d-47c5-a745-a1db1cab9669\") " pod="openshift-marketplace/redhat-operators-mmvgq" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.799769 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l6n8\" (UniqueName: \"kubernetes.io/projected/7e052e6e-6b6d-47c5-a745-a1db1cab9669-kube-api-access-4l6n8\") pod \"redhat-operators-mmvgq\" (UID: \"7e052e6e-6b6d-47c5-a745-a1db1cab9669\") " pod="openshift-marketplace/redhat-operators-mmvgq" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.848763 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.900864 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l6n8\" (UniqueName: \"kubernetes.io/projected/7e052e6e-6b6d-47c5-a745-a1db1cab9669-kube-api-access-4l6n8\") pod \"redhat-operators-mmvgq\" (UID: \"7e052e6e-6b6d-47c5-a745-a1db1cab9669\") " pod="openshift-marketplace/redhat-operators-mmvgq" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.900930 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e052e6e-6b6d-47c5-a745-a1db1cab9669-utilities\") pod \"redhat-operators-mmvgq\" (UID: \"7e052e6e-6b6d-47c5-a745-a1db1cab9669\") " pod="openshift-marketplace/redhat-operators-mmvgq" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.900954 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e052e6e-6b6d-47c5-a745-a1db1cab9669-catalog-content\") pod \"redhat-operators-mmvgq\" (UID: \"7e052e6e-6b6d-47c5-a745-a1db1cab9669\") " pod="openshift-marketplace/redhat-operators-mmvgq" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.901593 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e052e6e-6b6d-47c5-a745-a1db1cab9669-catalog-content\") pod \"redhat-operators-mmvgq\" (UID: \"7e052e6e-6b6d-47c5-a745-a1db1cab9669\") " pod="openshift-marketplace/redhat-operators-mmvgq" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.901915 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e052e6e-6b6d-47c5-a745-a1db1cab9669-utilities\") pod \"redhat-operators-mmvgq\" (UID: \"7e052e6e-6b6d-47c5-a745-a1db1cab9669\") " 
pod="openshift-marketplace/redhat-operators-mmvgq" Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.928246 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-67hw7"] Mar 07 07:03:57 crc kubenswrapper[4738]: I0307 07:03:57.940321 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l6n8\" (UniqueName: \"kubernetes.io/projected/7e052e6e-6b6d-47c5-a745-a1db1cab9669-kube-api-access-4l6n8\") pod \"redhat-operators-mmvgq\" (UID: \"7e052e6e-6b6d-47c5-a745-a1db1cab9669\") " pod="openshift-marketplace/redhat-operators-mmvgq" Mar 07 07:03:57 crc kubenswrapper[4738]: W0307 07:03:57.942693 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11b75931_7617_4db8_b9e0_62a32ccd6948.slice/crio-29ef7d41857452915f61a273e1a2785ef0f0e6560e8fbfeb71c28e248765fd85 WatchSource:0}: Error finding container 29ef7d41857452915f61a273e1a2785ef0f0e6560e8fbfeb71c28e248765fd85: Status 404 returned error can't find the container with id 29ef7d41857452915f61a273e1a2785ef0f0e6560e8fbfeb71c28e248765fd85 Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.003905 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7f0de6fb-1e9a-4678-821a-533eb8bb9766-kube-api-access\") pod \"7f0de6fb-1e9a-4678-821a-533eb8bb9766\" (UID: \"7f0de6fb-1e9a-4678-821a-533eb8bb9766\") " Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.004646 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7f0de6fb-1e9a-4678-821a-533eb8bb9766-kubelet-dir\") pod \"7f0de6fb-1e9a-4678-821a-533eb8bb9766\" (UID: \"7f0de6fb-1e9a-4678-821a-533eb8bb9766\") " Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.005443 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/host-path/7f0de6fb-1e9a-4678-821a-533eb8bb9766-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7f0de6fb-1e9a-4678-821a-533eb8bb9766" (UID: "7f0de6fb-1e9a-4678-821a-533eb8bb9766"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.018433 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f0de6fb-1e9a-4678-821a-533eb8bb9766-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7f0de6fb-1e9a-4678-821a-533eb8bb9766" (UID: "7f0de6fb-1e9a-4678-821a-533eb8bb9766"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.019856 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mmvgq" Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.021559 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7f0de6fb-1e9a-4678-821a-533eb8bb9766-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.021670 4738 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7f0de6fb-1e9a-4678-821a-533eb8bb9766-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.110197 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gfmjp"] Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.349249 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fab6aabb-c230-4043-94b0-b3ab1e6a8891","Type":"ContainerStarted","Data":"2f434fe410df71475c39b43d9156431d7783e579a278d8c20e1cd92cf5091974"} Mar 07 07:03:58 crc 
kubenswrapper[4738]: I0307 07:03:58.354112 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfmjp" event={"ID":"2731d6f8-adb2-4068-a3c9-162cfcb7de07","Type":"ContainerStarted","Data":"55a2a1e2912bd12ed176110db8b60c8fc33ec3080e158bd64855676c21b3663b"} Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.359338 4738 generic.go:334] "Generic (PLEG): container finished" podID="11b75931-7617-4db8-b9e0-62a32ccd6948" containerID="fe078cb41a33ede83d13f77e23e14f8f237234696ed5a881623a0c96c5277d19" exitCode=0 Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.359459 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67hw7" event={"ID":"11b75931-7617-4db8-b9e0-62a32ccd6948","Type":"ContainerDied","Data":"fe078cb41a33ede83d13f77e23e14f8f237234696ed5a881623a0c96c5277d19"} Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.359494 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67hw7" event={"ID":"11b75931-7617-4db8-b9e0-62a32ccd6948","Type":"ContainerStarted","Data":"29ef7d41857452915f61a273e1a2785ef0f0e6560e8fbfeb71c28e248765fd85"} Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.390331 4738 generic.go:334] "Generic (PLEG): container finished" podID="4ccab57d-f355-494a-adae-5a1dba9c360a" containerID="48702995f20f0e43a4d64658a812c36ee8d8be5ef48a81f37af7c66d66fbc1d9" exitCode=0 Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.396102 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnt9k" event={"ID":"4ccab57d-f355-494a-adae-5a1dba9c360a","Type":"ContainerDied","Data":"48702995f20f0e43a4d64658a812c36ee8d8be5ef48a81f37af7c66d66fbc1d9"} Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.401021 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" 
event={"ID":"28c5fa97-13ce-4c5c-83b5-fdd02611de9c","Type":"ContainerStarted","Data":"845ac5a2a2705e56998b6143ff6c4333356f196445bd7839a37d53551017f09a"} Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.401069 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" event={"ID":"28c5fa97-13ce-4c5c-83b5-fdd02611de9c","Type":"ContainerStarted","Data":"0136e5e3e41afe9f53d74f1f3f8e996888f378737ac31263511ca611cacf018d"} Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.401429 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.408756 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"7f0de6fb-1e9a-4678-821a-533eb8bb9766","Type":"ContainerDied","Data":"f12a9a3e7c164534650af38086e8faf79faa31c7cc258e831b8b22d33cca4e79"} Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.408804 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f12a9a3e7c164534650af38086e8faf79faa31c7cc258e831b8b22d33cca4e79" Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.408899 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.417058 4738 ???:1] "http: TLS handshake error from 192.168.126.11:55610: no serving certificate available for the kubelet" Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.424741 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" event={"ID":"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b","Type":"ContainerStarted","Data":"b410e5d6dde4b1b3270ed83fb1769e66989622d81d5915d449f20bac269525c0"} Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.424781 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" event={"ID":"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b","Type":"ContainerStarted","Data":"d941e594eb2d74c696d3501c8326b5f3fad891590a0205193e31c65719f479b1"} Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.424795 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.426902 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.454483 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" podStartSLOduration=3.454465501 podStartE2EDuration="3.454465501s" podCreationTimestamp="2026-03-07 07:03:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:58.450199587 +0000 UTC m=+256.915187028" watchObservedRunningTime="2026-03-07 07:03:58.454465501 +0000 UTC m=+256.919452822" Mar 07 07:03:58 crc 
kubenswrapper[4738]: I0307 07:03:58.471305 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 07:03:58 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld Mar 07 07:03:58 crc kubenswrapper[4738]: [+]process-running ok Mar 07 07:03:58 crc kubenswrapper[4738]: healthz check failed Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.471368 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.472026 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" podStartSLOduration=3.471999025 podStartE2EDuration="3.471999025s" podCreationTimestamp="2026-03-07 07:03:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:58.469689994 +0000 UTC m=+256.934677315" watchObservedRunningTime="2026-03-07 07:03:58.471999025 +0000 UTC m=+256.936986346" Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.538697 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.574067 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mmvgq"] Mar 07 07:03:58 crc kubenswrapper[4738]: I0307 07:03:58.949756 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.058408 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-255nm\" (UniqueName: \"kubernetes.io/projected/e1528346-60ec-4879-81b0-72027f1d1477-kube-api-access-255nm\") pod \"e1528346-60ec-4879-81b0-72027f1d1477\" (UID: \"e1528346-60ec-4879-81b0-72027f1d1477\") " Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.058825 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1528346-60ec-4879-81b0-72027f1d1477-config-volume\") pod \"e1528346-60ec-4879-81b0-72027f1d1477\" (UID: \"e1528346-60ec-4879-81b0-72027f1d1477\") " Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.058922 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1528346-60ec-4879-81b0-72027f1d1477-secret-volume\") pod \"e1528346-60ec-4879-81b0-72027f1d1477\" (UID: \"e1528346-60ec-4879-81b0-72027f1d1477\") " Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.061252 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1528346-60ec-4879-81b0-72027f1d1477-config-volume" (OuterVolumeSpecName: "config-volume") pod "e1528346-60ec-4879-81b0-72027f1d1477" (UID: "e1528346-60ec-4879-81b0-72027f1d1477"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.070129 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1528346-60ec-4879-81b0-72027f1d1477-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e1528346-60ec-4879-81b0-72027f1d1477" (UID: "e1528346-60ec-4879-81b0-72027f1d1477"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.070189 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1528346-60ec-4879-81b0-72027f1d1477-kube-api-access-255nm" (OuterVolumeSpecName: "kube-api-access-255nm") pod "e1528346-60ec-4879-81b0-72027f1d1477" (UID: "e1528346-60ec-4879-81b0-72027f1d1477"). InnerVolumeSpecName "kube-api-access-255nm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.165012 4738 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e1528346-60ec-4879-81b0-72027f1d1477-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.165069 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-255nm\" (UniqueName: \"kubernetes.io/projected/e1528346-60ec-4879-81b0-72027f1d1477-kube-api-access-255nm\") on node \"crc\" DevicePath \"\""
Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.165082 4738 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e1528346-60ec-4879-81b0-72027f1d1477-config-volume\") on node \"crc\" DevicePath \"\""
Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.456778 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmvgq" event={"ID":"7e052e6e-6b6d-47c5-a745-a1db1cab9669","Type":"ContainerStarted","Data":"c68a8c4b9b1c003e4ad1618884fdacba43bdfbf231432d052f03e7bd7c760c98"}
Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.466979 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 07 07:03:59 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld
Mar 07 07:03:59 crc kubenswrapper[4738]: [+]process-running ok
Mar 07 07:03:59 crc kubenswrapper[4738]: healthz check failed
Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.467071 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.484847 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fab6aabb-c230-4043-94b0-b3ab1e6a8891","Type":"ContainerStarted","Data":"3c576693f0a2fbdd0a28c4f9073a6e2500d8ddd57710f6d6ed3c0382e96bd4b7"}
Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.497589 4738 generic.go:334] "Generic (PLEG): container finished" podID="2731d6f8-adb2-4068-a3c9-162cfcb7de07" containerID="eb86e95a5c935a21690af18bdaeb2732f5ef1b762aa6c887cc26353e53e183d3" exitCode=0
Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.497653 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfmjp" event={"ID":"2731d6f8-adb2-4068-a3c9-162cfcb7de07","Type":"ContainerDied","Data":"eb86e95a5c935a21690af18bdaeb2732f5ef1b762aa6c887cc26353e53e183d3"}
Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.502727 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.502711561 podStartE2EDuration="3.502711561s" podCreationTimestamp="2026-03-07 07:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:03:59.501494649 +0000 UTC m=+257.966481970" watchObservedRunningTime="2026-03-07 07:03:59.502711561 +0000 UTC m=+257.967698882"
Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.545904 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb" event={"ID":"e1528346-60ec-4879-81b0-72027f1d1477","Type":"ContainerDied","Data":"6665d6e29556438d724045bb4a48c66b142d07234722dc3ccf1fd38a073fa84b"}
Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.546745 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6665d6e29556438d724045bb4a48c66b142d07234722dc3ccf1fd38a073fa84b"
Mar 07 07:03:59 crc kubenswrapper[4738]: I0307 07:03:59.547101 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb"
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.137061 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547784-vqg4n"]
Mar 07 07:04:00 crc kubenswrapper[4738]: E0307 07:04:00.138732 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f0de6fb-1e9a-4678-821a-533eb8bb9766" containerName="pruner"
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.138749 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f0de6fb-1e9a-4678-821a-533eb8bb9766" containerName="pruner"
Mar 07 07:04:00 crc kubenswrapper[4738]: E0307 07:04:00.138773 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1528346-60ec-4879-81b0-72027f1d1477" containerName="collect-profiles"
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.138805 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1528346-60ec-4879-81b0-72027f1d1477" containerName="collect-profiles"
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.138958 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f0de6fb-1e9a-4678-821a-533eb8bb9766" containerName="pruner"
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.138974 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1528346-60ec-4879-81b0-72027f1d1477" containerName="collect-profiles"
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.139525 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547784-vqg4n"
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.141611 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5"
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.142346 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547784-vqg4n"]
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.281867 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dg4q9\" (UniqueName: \"kubernetes.io/projected/a41584d7-5e09-4e4b-9b1d-6aa4b7d13039-kube-api-access-dg4q9\") pod \"auto-csr-approver-29547784-vqg4n\" (UID: \"a41584d7-5e09-4e4b-9b1d-6aa4b7d13039\") " pod="openshift-infra/auto-csr-approver-29547784-vqg4n"
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.383601 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dg4q9\" (UniqueName: \"kubernetes.io/projected/a41584d7-5e09-4e4b-9b1d-6aa4b7d13039-kube-api-access-dg4q9\") pod \"auto-csr-approver-29547784-vqg4n\" (UID: \"a41584d7-5e09-4e4b-9b1d-6aa4b7d13039\") " pod="openshift-infra/auto-csr-approver-29547784-vqg4n"
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.419496 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dg4q9\" (UniqueName: \"kubernetes.io/projected/a41584d7-5e09-4e4b-9b1d-6aa4b7d13039-kube-api-access-dg4q9\") pod \"auto-csr-approver-29547784-vqg4n\" (UID: \"a41584d7-5e09-4e4b-9b1d-6aa4b7d13039\") " pod="openshift-infra/auto-csr-approver-29547784-vqg4n"
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.461190 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 07 07:04:00 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld
Mar 07 07:04:00 crc kubenswrapper[4738]: [+]process-running ok
Mar 07 07:04:00 crc kubenswrapper[4738]: healthz check failed
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.461279 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.462566 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547784-vqg4n"
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.557657 4738 generic.go:334] "Generic (PLEG): container finished" podID="fab6aabb-c230-4043-94b0-b3ab1e6a8891" containerID="3c576693f0a2fbdd0a28c4f9073a6e2500d8ddd57710f6d6ed3c0382e96bd4b7" exitCode=0
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.557780 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fab6aabb-c230-4043-94b0-b3ab1e6a8891","Type":"ContainerDied","Data":"3c576693f0a2fbdd0a28c4f9073a6e2500d8ddd57710f6d6ed3c0382e96bd4b7"}
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.570534 4738 generic.go:334] "Generic (PLEG): container finished" podID="7e052e6e-6b6d-47c5-a745-a1db1cab9669" containerID="c6b9e63d45fa7436d0adb4c89f03af7658bf623bca109c22bf3a397481d22057" exitCode=0
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.571850 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmvgq" event={"ID":"7e052e6e-6b6d-47c5-a745-a1db1cab9669","Type":"ContainerDied","Data":"c6b9e63d45fa7436d0adb4c89f03af7658bf623bca109c22bf3a397481d22057"}
Mar 07 07:04:00 crc kubenswrapper[4738]: I0307 07:04:00.743196 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547784-vqg4n"]
Mar 07 07:04:01 crc kubenswrapper[4738]: I0307 07:04:01.018457 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:04:01 crc kubenswrapper[4738]: I0307 07:04:01.024381 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-rb6b7"
Mar 07 07:04:01 crc kubenswrapper[4738]: I0307 07:04:01.461342 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 07 07:04:01 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld
Mar 07 07:04:01 crc kubenswrapper[4738]: [+]process-running ok
Mar 07 07:04:01 crc kubenswrapper[4738]: healthz check failed
Mar 07 07:04:01 crc kubenswrapper[4738]: I0307 07:04:01.461404 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 07 07:04:02 crc kubenswrapper[4738]: I0307 07:04:02.363864 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-lmc29"
Mar 07 07:04:02 crc kubenswrapper[4738]: I0307 07:04:02.460862 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 07 07:04:02 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld
Mar 07 07:04:02 crc kubenswrapper[4738]: [+]process-running ok
Mar 07 07:04:02 crc kubenswrapper[4738]: healthz check failed
Mar 07 07:04:02 crc kubenswrapper[4738]: I0307 07:04:02.460973 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 07 07:04:03 crc kubenswrapper[4738]: I0307 07:04:03.370776 4738 ???:1] "http: TLS handshake error from 192.168.126.11:36664: no serving certificate available for the kubelet"
Mar 07 07:04:03 crc kubenswrapper[4738]: I0307 07:04:03.460192 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 07 07:04:03 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld
Mar 07 07:04:03 crc kubenswrapper[4738]: [+]process-running ok
Mar 07 07:04:03 crc kubenswrapper[4738]: healthz check failed
Mar 07 07:04:03 crc kubenswrapper[4738]: I0307 07:04:03.460284 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 07 07:04:03 crc kubenswrapper[4738]: I0307 07:04:03.566000 4738 ???:1] "http: TLS handshake error from 192.168.126.11:36680: no serving certificate available for the kubelet"
Mar 07 07:04:04 crc kubenswrapper[4738]: I0307 07:04:04.461470 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 07 07:04:04 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld
Mar 07 07:04:04 crc kubenswrapper[4738]: [+]process-running ok
Mar 07 07:04:04 crc kubenswrapper[4738]: healthz check failed
Mar 07 07:04:04 crc kubenswrapper[4738]: I0307 07:04:04.461761 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 07 07:04:05 crc kubenswrapper[4738]: I0307 07:04:05.460966 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 07 07:04:05 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld
Mar 07 07:04:05 crc kubenswrapper[4738]: [+]process-running ok
Mar 07 07:04:05 crc kubenswrapper[4738]: healthz check failed
Mar 07 07:04:05 crc kubenswrapper[4738]: I0307 07:04:05.461055 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 07 07:04:06 crc kubenswrapper[4738]: I0307 07:04:06.415668 4738 patch_prober.go:28] interesting pod/console-f9d7485db-rvllr container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body=
Mar 07 07:04:06 crc kubenswrapper[4738]: I0307 07:04:06.415760 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-rvllr" podUID="e863c889-47c2-459d-a84f-dc360fe3098f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.6:8443/health\": dial tcp 10.217.0.6:8443: connect: connection refused"
Mar 07 07:04:06 crc kubenswrapper[4738]: I0307 07:04:06.461670 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 07 07:04:06 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld
Mar 07 07:04:06 crc kubenswrapper[4738]: [+]process-running ok
Mar 07 07:04:06 crc kubenswrapper[4738]: healthz check failed
Mar 07 07:04:06 crc kubenswrapper[4738]: I0307 07:04:06.461761 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 07 07:04:06 crc kubenswrapper[4738]: I0307 07:04:06.681191 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-wwk6z"
Mar 07 07:04:07 crc kubenswrapper[4738]: I0307 07:04:07.460654 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 07 07:04:07 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld
Mar 07 07:04:07 crc kubenswrapper[4738]: [+]process-running ok
Mar 07 07:04:07 crc kubenswrapper[4738]: healthz check failed
Mar 07 07:04:07 crc kubenswrapper[4738]: I0307 07:04:07.460710 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 07 07:04:08 crc kubenswrapper[4738]: I0307 07:04:08.462298 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 07 07:04:08 crc kubenswrapper[4738]: [-]has-synced failed: reason withheld
Mar 07 07:04:08 crc kubenswrapper[4738]: [+]process-running ok
Mar 07 07:04:08 crc kubenswrapper[4738]: healthz check failed
Mar 07 07:04:08 crc kubenswrapper[4738]: I0307 07:04:08.462954 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 07 07:04:09 crc kubenswrapper[4738]: I0307 07:04:09.460260 4738 patch_prober.go:28] interesting pod/router-default-5444994796-k9g6h container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 07 07:04:09 crc kubenswrapper[4738]: [+]has-synced ok
Mar 07 07:04:09 crc kubenswrapper[4738]: [+]process-running ok
Mar 07 07:04:09 crc kubenswrapper[4738]: healthz check failed
Mar 07 07:04:09 crc kubenswrapper[4738]: I0307 07:04:09.460358 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-k9g6h" podUID="f752fd6d-5074-4f1f-ae2e-f1c4225536f6" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 07 07:04:10 crc kubenswrapper[4738]: W0307 07:04:10.297853 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda41584d7_5e09_4e4b_9b1d_6aa4b7d13039.slice/crio-aa131226abe5e48d01625859db4abb219c9eecd2fe33aaac2269a967791d052f WatchSource:0}: Error finding container aa131226abe5e48d01625859db4abb219c9eecd2fe33aaac2269a967791d052f: Status 404 returned error can't find the container with id aa131226abe5e48d01625859db4abb219c9eecd2fe33aaac2269a967791d052f
Mar 07 07:04:10 crc kubenswrapper[4738]: I0307 07:04:10.372723 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 07 07:04:10 crc kubenswrapper[4738]: I0307 07:04:10.463987 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-k9g6h"
Mar 07 07:04:10 crc kubenswrapper[4738]: I0307 07:04:10.465998 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-k9g6h"
Mar 07 07:04:10 crc kubenswrapper[4738]: I0307 07:04:10.481062 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fab6aabb-c230-4043-94b0-b3ab1e6a8891-kube-api-access\") pod \"fab6aabb-c230-4043-94b0-b3ab1e6a8891\" (UID: \"fab6aabb-c230-4043-94b0-b3ab1e6a8891\") "
Mar 07 07:04:10 crc kubenswrapper[4738]: I0307 07:04:10.481151 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fab6aabb-c230-4043-94b0-b3ab1e6a8891-kubelet-dir\") pod \"fab6aabb-c230-4043-94b0-b3ab1e6a8891\" (UID: \"fab6aabb-c230-4043-94b0-b3ab1e6a8891\") "
Mar 07 07:04:10 crc kubenswrapper[4738]: I0307 07:04:10.481680 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fab6aabb-c230-4043-94b0-b3ab1e6a8891-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fab6aabb-c230-4043-94b0-b3ab1e6a8891" (UID: "fab6aabb-c230-4043-94b0-b3ab1e6a8891"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 07:04:10 crc kubenswrapper[4738]: I0307 07:04:10.491790 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fab6aabb-c230-4043-94b0-b3ab1e6a8891-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fab6aabb-c230-4043-94b0-b3ab1e6a8891" (UID: "fab6aabb-c230-4043-94b0-b3ab1e6a8891"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:04:10 crc kubenswrapper[4738]: I0307 07:04:10.583351 4738 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fab6aabb-c230-4043-94b0-b3ab1e6a8891-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 07 07:04:10 crc kubenswrapper[4738]: I0307 07:04:10.583390 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fab6aabb-c230-4043-94b0-b3ab1e6a8891-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 07 07:04:10 crc kubenswrapper[4738]: I0307 07:04:10.675562 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"fab6aabb-c230-4043-94b0-b3ab1e6a8891","Type":"ContainerDied","Data":"2f434fe410df71475c39b43d9156431d7783e579a278d8c20e1cd92cf5091974"}
Mar 07 07:04:10 crc kubenswrapper[4738]: I0307 07:04:10.675653 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f434fe410df71475c39b43d9156431d7783e579a278d8c20e1cd92cf5091974"
Mar 07 07:04:10 crc kubenswrapper[4738]: I0307 07:04:10.675805 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 07 07:04:10 crc kubenswrapper[4738]: I0307 07:04:10.678875 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547784-vqg4n" event={"ID":"a41584d7-5e09-4e4b-9b1d-6aa4b7d13039","Type":"ContainerStarted","Data":"aa131226abe5e48d01625859db4abb219c9eecd2fe33aaac2269a967791d052f"}
Mar 07 07:04:14 crc kubenswrapper[4738]: I0307 07:04:14.015219 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58445c889b-vqh7f"]
Mar 07 07:04:14 crc kubenswrapper[4738]: I0307 07:04:14.016001 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" podUID="28c5fa97-13ce-4c5c-83b5-fdd02611de9c" containerName="controller-manager" containerID="cri-o://845ac5a2a2705e56998b6143ff6c4333356f196445bd7839a37d53551017f09a" gracePeriod=30
Mar 07 07:04:14 crc kubenswrapper[4738]: I0307 07:04:14.044748 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk"]
Mar 07 07:04:14 crc kubenswrapper[4738]: I0307 07:04:14.045222 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" podUID="0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b" containerName="route-controller-manager" containerID="cri-o://b410e5d6dde4b1b3270ed83fb1769e66989622d81d5915d449f20bac269525c0" gracePeriod=30
Mar 07 07:04:14 crc kubenswrapper[4738]: I0307 07:04:14.702801 4738 generic.go:334] "Generic (PLEG): container finished" podID="0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b" containerID="b410e5d6dde4b1b3270ed83fb1769e66989622d81d5915d449f20bac269525c0" exitCode=0
Mar 07 07:04:14 crc kubenswrapper[4738]: I0307 07:04:14.702892 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" event={"ID":"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b","Type":"ContainerDied","Data":"b410e5d6dde4b1b3270ed83fb1769e66989622d81d5915d449f20bac269525c0"}
Mar 07 07:04:14 crc kubenswrapper[4738]: I0307 07:04:14.704445 4738 generic.go:334] "Generic (PLEG): container finished" podID="28c5fa97-13ce-4c5c-83b5-fdd02611de9c" containerID="845ac5a2a2705e56998b6143ff6c4333356f196445bd7839a37d53551017f09a" exitCode=0
Mar 07 07:04:14 crc kubenswrapper[4738]: I0307 07:04:14.704481 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" event={"ID":"28c5fa97-13ce-4c5c-83b5-fdd02611de9c","Type":"ContainerDied","Data":"845ac5a2a2705e56998b6143ff6c4333356f196445bd7839a37d53551017f09a"}
Mar 07 07:04:15 crc kubenswrapper[4738]: E0307 07:04:15.707103 4738 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest"
Mar 07 07:04:15 crc kubenswrapper[4738]: E0307 07:04:15.707549 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 07 07:04:15 crc kubenswrapper[4738]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve
Mar 07 07:04:15 crc kubenswrapper[4738]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wmxrs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29547782-mrbc5_openshift-infra(09f978c5-7fd4-4852-95c4-915304c1bf18): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled
Mar 07 07:04:15 crc kubenswrapper[4738]: > logger="UnhandledError"
Mar 07 07:04:15 crc kubenswrapper[4738]: E0307 07:04:15.710574 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29547782-mrbc5" podUID="09f978c5-7fd4-4852-95c4-915304c1bf18"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.079778 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.208738 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.240347 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79f5cfd579-pmfw8"]
Mar 07 07:04:16 crc kubenswrapper[4738]: E0307 07:04:16.241361 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fab6aabb-c230-4043-94b0-b3ab1e6a8891" containerName="pruner"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.241376 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="fab6aabb-c230-4043-94b0-b3ab1e6a8891" containerName="pruner"
Mar 07 07:04:16 crc kubenswrapper[4738]: E0307 07:04:16.241395 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28c5fa97-13ce-4c5c-83b5-fdd02611de9c" containerName="controller-manager"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.241401 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="28c5fa97-13ce-4c5c-83b5-fdd02611de9c" containerName="controller-manager"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.241500 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="fab6aabb-c230-4043-94b0-b3ab1e6a8891" containerName="pruner"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.241514 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="28c5fa97-13ce-4c5c-83b5-fdd02611de9c" containerName="controller-manager"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.241921 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.253784 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79f5cfd579-pmfw8"]
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.288918 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-proxy-ca-bundles\") pod \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") "
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.288966 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-serving-cert\") pod \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") "
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.289011 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-config\") pod \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") "
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.289076 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhnsz\" (UniqueName: \"kubernetes.io/projected/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-kube-api-access-qhnsz\") pod \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") "
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.289130 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-client-ca\") pod \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\" (UID: \"28c5fa97-13ce-4c5c-83b5-fdd02611de9c\") "
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.290145 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-client-ca" (OuterVolumeSpecName: "client-ca") pod "28c5fa97-13ce-4c5c-83b5-fdd02611de9c" (UID: "28c5fa97-13ce-4c5c-83b5-fdd02611de9c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.290293 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "28c5fa97-13ce-4c5c-83b5-fdd02611de9c" (UID: "28c5fa97-13ce-4c5c-83b5-fdd02611de9c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.290321 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-config" (OuterVolumeSpecName: "config") pod "28c5fa97-13ce-4c5c-83b5-fdd02611de9c" (UID: "28c5fa97-13ce-4c5c-83b5-fdd02611de9c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.299609 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "28c5fa97-13ce-4c5c-83b5-fdd02611de9c" (UID: "28c5fa97-13ce-4c5c-83b5-fdd02611de9c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.299882 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-kube-api-access-qhnsz" (OuterVolumeSpecName: "kube-api-access-qhnsz") pod "28c5fa97-13ce-4c5c-83b5-fdd02611de9c" (UID: "28c5fa97-13ce-4c5c-83b5-fdd02611de9c"). InnerVolumeSpecName "kube-api-access-qhnsz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.391184 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kxbs\" (UniqueName: \"kubernetes.io/projected/2b021d4e-c013-40ae-8601-9aa2dd013daa-kube-api-access-6kxbs\") pod \"controller-manager-79f5cfd579-pmfw8\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.391243 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-proxy-ca-bundles\") pod \"controller-manager-79f5cfd579-pmfw8\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.391268 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b021d4e-c013-40ae-8601-9aa2dd013daa-serving-cert\") pod \"controller-manager-79f5cfd579-pmfw8\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.391353 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-config\") pod \"controller-manager-79f5cfd579-pmfw8\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.391388 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-client-ca\") pod \"controller-manager-79f5cfd579-pmfw8\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.391471 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-client-ca\") on node \"crc\" DevicePath \"\""
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.391487 4738 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.391560 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.391604 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-config\") on node \"crc\" DevicePath \"\""
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.391620 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhnsz\" (UniqueName: \"kubernetes.io/projected/28c5fa97-13ce-4c5c-83b5-fdd02611de9c-kube-api-access-qhnsz\") on node \"crc\" DevicePath \"\""
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.425037 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-rvllr"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.431744 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-rvllr"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.498664 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b021d4e-c013-40ae-8601-9aa2dd013daa-serving-cert\") pod \"controller-manager-79f5cfd579-pmfw8\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.498873 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-config\") pod \"controller-manager-79f5cfd579-pmfw8\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.498977 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-client-ca\") pod \"controller-manager-79f5cfd579-pmfw8\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8"
Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.499038 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kxbs\" (UniqueName: \"kubernetes.io/projected/2b021d4e-c013-40ae-8601-9aa2dd013daa-kube-api-access-6kxbs\") pod
\"controller-manager-79f5cfd579-pmfw8\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.499071 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-proxy-ca-bundles\") pod \"controller-manager-79f5cfd579-pmfw8\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.500631 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-config\") pod \"controller-manager-79f5cfd579-pmfw8\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.501235 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-client-ca\") pod \"controller-manager-79f5cfd579-pmfw8\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.501260 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-proxy-ca-bundles\") pod \"controller-manager-79f5cfd579-pmfw8\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.505861 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2b021d4e-c013-40ae-8601-9aa2dd013daa-serving-cert\") pod \"controller-manager-79f5cfd579-pmfw8\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.520125 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kxbs\" (UniqueName: \"kubernetes.io/projected/2b021d4e-c013-40ae-8601-9aa2dd013daa-kube-api-access-6kxbs\") pod \"controller-manager-79f5cfd579-pmfw8\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.572467 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.721594 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.721488 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58445c889b-vqh7f" event={"ID":"28c5fa97-13ce-4c5c-83b5-fdd02611de9c","Type":"ContainerDied","Data":"0136e5e3e41afe9f53d74f1f3f8e996888f378737ac31263511ca611cacf018d"} Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.721902 4738 scope.go:117] "RemoveContainer" containerID="845ac5a2a2705e56998b6143ff6c4333356f196445bd7839a37d53551017f09a" Mar 07 07:04:16 crc kubenswrapper[4738]: E0307 07:04:16.724606 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29547782-mrbc5" podUID="09f978c5-7fd4-4852-95c4-915304c1bf18" Mar 07 07:04:16 
crc kubenswrapper[4738]: I0307 07:04:16.758789 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58445c889b-vqh7f"] Mar 07 07:04:16 crc kubenswrapper[4738]: I0307 07:04:16.761877 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-58445c889b-vqh7f"] Mar 07 07:04:18 crc kubenswrapper[4738]: I0307 07:04:18.007834 4738 patch_prober.go:28] interesting pod/route-controller-manager-6c98444477-mnjxk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: i/o timeout" start-of-body= Mar 07 07:04:18 crc kubenswrapper[4738]: I0307 07:04:18.008260 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" podUID="0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: i/o timeout" Mar 07 07:04:18 crc kubenswrapper[4738]: I0307 07:04:18.392819 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28c5fa97-13ce-4c5c-83b5-fdd02611de9c" path="/var/lib/kubelet/pods/28c5fa97-13ce-4c5c-83b5-fdd02611de9c/volumes" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.723458 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.768658 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t"] Mar 07 07:04:21 crc kubenswrapper[4738]: E0307 07:04:21.769055 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b" containerName="route-controller-manager" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.769073 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b" containerName="route-controller-manager" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.769329 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b" containerName="route-controller-manager" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.769962 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.785920 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t"] Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.789904 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" event={"ID":"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b","Type":"ContainerDied","Data":"d941e594eb2d74c696d3501c8326b5f3fad891590a0205193e31c65719f479b1"} Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.790273 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.891581 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-client-ca\") pod \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.891689 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-config\") pod \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.891742 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ms4z7\" (UniqueName: \"kubernetes.io/projected/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-kube-api-access-ms4z7\") pod \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.891860 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-serving-cert\") pod \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\" (UID: \"0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b\") " Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.892032 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjk9g\" (UniqueName: \"kubernetes.io/projected/edea93fc-1b38-49c3-aaff-696c5cd2cb93-kube-api-access-zjk9g\") pod \"route-controller-manager-644fbb89fd-dj65t\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:21 crc 
kubenswrapper[4738]: I0307 07:04:21.892134 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edea93fc-1b38-49c3-aaff-696c5cd2cb93-serving-cert\") pod \"route-controller-manager-644fbb89fd-dj65t\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.892207 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edea93fc-1b38-49c3-aaff-696c5cd2cb93-client-ca\") pod \"route-controller-manager-644fbb89fd-dj65t\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.892235 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edea93fc-1b38-49c3-aaff-696c5cd2cb93-config\") pod \"route-controller-manager-644fbb89fd-dj65t\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.893445 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-client-ca" (OuterVolumeSpecName: "client-ca") pod "0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b" (UID: "0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.893891 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-config" (OuterVolumeSpecName: "config") pod "0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b" (UID: "0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.910204 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b" (UID: "0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.915233 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-kube-api-access-ms4z7" (OuterVolumeSpecName: "kube-api-access-ms4z7") pod "0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b" (UID: "0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b"). InnerVolumeSpecName "kube-api-access-ms4z7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.993509 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjk9g\" (UniqueName: \"kubernetes.io/projected/edea93fc-1b38-49c3-aaff-696c5cd2cb93-kube-api-access-zjk9g\") pod \"route-controller-manager-644fbb89fd-dj65t\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.993641 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edea93fc-1b38-49c3-aaff-696c5cd2cb93-serving-cert\") pod \"route-controller-manager-644fbb89fd-dj65t\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.993734 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edea93fc-1b38-49c3-aaff-696c5cd2cb93-client-ca\") pod \"route-controller-manager-644fbb89fd-dj65t\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.993766 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edea93fc-1b38-49c3-aaff-696c5cd2cb93-config\") pod \"route-controller-manager-644fbb89fd-dj65t\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.994135 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.994171 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.994182 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.994196 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ms4z7\" (UniqueName: \"kubernetes.io/projected/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b-kube-api-access-ms4z7\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.995456 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edea93fc-1b38-49c3-aaff-696c5cd2cb93-config\") pod \"route-controller-manager-644fbb89fd-dj65t\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:21 crc kubenswrapper[4738]: I0307 07:04:21.997347 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edea93fc-1b38-49c3-aaff-696c5cd2cb93-client-ca\") pod \"route-controller-manager-644fbb89fd-dj65t\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:22 crc kubenswrapper[4738]: I0307 07:04:22.010353 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edea93fc-1b38-49c3-aaff-696c5cd2cb93-serving-cert\") pod 
\"route-controller-manager-644fbb89fd-dj65t\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:22 crc kubenswrapper[4738]: I0307 07:04:22.014780 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjk9g\" (UniqueName: \"kubernetes.io/projected/edea93fc-1b38-49c3-aaff-696c5cd2cb93-kube-api-access-zjk9g\") pod \"route-controller-manager-644fbb89fd-dj65t\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:22 crc kubenswrapper[4738]: I0307 07:04:22.103768 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:22 crc kubenswrapper[4738]: I0307 07:04:22.126379 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk"] Mar 07 07:04:22 crc kubenswrapper[4738]: I0307 07:04:22.131081 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6c98444477-mnjxk"] Mar 07 07:04:22 crc kubenswrapper[4738]: I0307 07:04:22.394066 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b" path="/var/lib/kubelet/pods/0d54ebb3-2fee-40e7-bd09-a3d3a6d6018b/volumes" Mar 07 07:04:24 crc kubenswrapper[4738]: I0307 07:04:24.072259 4738 ???:1] "http: TLS handshake error from 192.168.126.11:36894: no serving certificate available for the kubelet" Mar 07 07:04:26 crc kubenswrapper[4738]: I0307 07:04:26.957760 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 07 07:04:26 crc kubenswrapper[4738]: I0307 07:04:26.957959 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:04:27 crc kubenswrapper[4738]: I0307 07:04:27.039101 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-kf278" Mar 07 07:04:28 crc kubenswrapper[4738]: I0307 07:04:28.431464 4738 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","pod7f0de6fb-1e9a-4678-821a-533eb8bb9766"] err="unable to destroy cgroup paths for cgroup [kubepods pod7f0de6fb-1e9a-4678-821a-533eb8bb9766] : Timed out while waiting for systemd to remove kubepods-pod7f0de6fb_1e9a_4678_821a_533eb8bb9766.slice" Mar 07 07:04:29 crc kubenswrapper[4738]: I0307 07:04:29.527147 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 07 07:04:29 crc kubenswrapper[4738]: I0307 07:04:29.529738 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 07:04:29 crc kubenswrapper[4738]: I0307 07:04:29.534335 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 07 07:04:29 crc kubenswrapper[4738]: I0307 07:04:29.534763 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 07 07:04:29 crc kubenswrapper[4738]: I0307 07:04:29.545095 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 07 07:04:29 crc kubenswrapper[4738]: I0307 07:04:29.628094 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a77c47d5-392b-4957-8db4-5665a3bb304a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a77c47d5-392b-4957-8db4-5665a3bb304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 07:04:29 crc kubenswrapper[4738]: I0307 07:04:29.628143 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a77c47d5-392b-4957-8db4-5665a3bb304a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a77c47d5-392b-4957-8db4-5665a3bb304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 07:04:29 crc kubenswrapper[4738]: I0307 07:04:29.730090 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a77c47d5-392b-4957-8db4-5665a3bb304a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a77c47d5-392b-4957-8db4-5665a3bb304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 07:04:29 crc kubenswrapper[4738]: I0307 07:04:29.730185 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/a77c47d5-392b-4957-8db4-5665a3bb304a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a77c47d5-392b-4957-8db4-5665a3bb304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 07:04:29 crc kubenswrapper[4738]: I0307 07:04:29.730330 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a77c47d5-392b-4957-8db4-5665a3bb304a-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a77c47d5-392b-4957-8db4-5665a3bb304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 07:04:29 crc kubenswrapper[4738]: I0307 07:04:29.753063 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a77c47d5-392b-4957-8db4-5665a3bb304a-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a77c47d5-392b-4957-8db4-5665a3bb304a\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 07:04:29 crc kubenswrapper[4738]: I0307 07:04:29.854530 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 07:04:31 crc kubenswrapper[4738]: E0307 07:04:31.016149 4738 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 07 07:04:31 crc kubenswrapper[4738]: E0307 07:04:31.016351 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zbhg8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-t6bh9_openshift-marketplace(f34dc9c2-f90f-473e-8dca-a3df3f70e02f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 07:04:31 crc kubenswrapper[4738]: E0307 07:04:31.017533 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-t6bh9" podUID="f34dc9c2-f90f-473e-8dca-a3df3f70e02f" Mar 07 07:04:32 crc kubenswrapper[4738]: E0307 07:04:32.084644 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-t6bh9" podUID="f34dc9c2-f90f-473e-8dca-a3df3f70e02f" Mar 07 07:04:32 crc kubenswrapper[4738]: E0307 07:04:32.332218 4738 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 07 07:04:32 crc kubenswrapper[4738]: E0307 07:04:32.332420 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6kwcq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-587ww_openshift-marketplace(b82c0da7-caec-462d-85e9-f3c45cc042b5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 07:04:32 crc kubenswrapper[4738]: E0307 07:04:32.333626 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-587ww" podUID="b82c0da7-caec-462d-85e9-f3c45cc042b5" Mar 07 07:04:33 crc 
kubenswrapper[4738]: I0307 07:04:33.924280 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 07 07:04:33 crc kubenswrapper[4738]: I0307 07:04:33.925811 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 07 07:04:33 crc kubenswrapper[4738]: I0307 07:04:33.934340 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 07 07:04:34 crc kubenswrapper[4738]: I0307 07:04:34.076237 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79f5cfd579-pmfw8"] Mar 07 07:04:34 crc kubenswrapper[4738]: I0307 07:04:34.097755 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03158b9d-6cc1-4f5d-af1f-d21de41d536d-kube-api-access\") pod \"installer-9-crc\" (UID: \"03158b9d-6cc1-4f5d-af1f-d21de41d536d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 07:04:34 crc kubenswrapper[4738]: I0307 07:04:34.098067 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03158b9d-6cc1-4f5d-af1f-d21de41d536d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"03158b9d-6cc1-4f5d-af1f-d21de41d536d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 07:04:34 crc kubenswrapper[4738]: I0307 07:04:34.098206 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/03158b9d-6cc1-4f5d-af1f-d21de41d536d-var-lock\") pod \"installer-9-crc\" (UID: \"03158b9d-6cc1-4f5d-af1f-d21de41d536d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 07:04:34 crc kubenswrapper[4738]: I0307 07:04:34.155982 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t"] Mar 07 07:04:34 crc kubenswrapper[4738]: E0307 07:04:34.199565 4738 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 07 07:04:34 crc kubenswrapper[4738]: I0307 07:04:34.199867 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/03158b9d-6cc1-4f5d-af1f-d21de41d536d-var-lock\") pod \"installer-9-crc\" (UID: \"03158b9d-6cc1-4f5d-af1f-d21de41d536d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 07:04:34 crc kubenswrapper[4738]: E0307 07:04:34.199871 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-587ww" podUID="b82c0da7-caec-462d-85e9-f3c45cc042b5" Mar 07 07:04:34 crc kubenswrapper[4738]: I0307 07:04:34.200049 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/03158b9d-6cc1-4f5d-af1f-d21de41d536d-var-lock\") pod \"installer-9-crc\" (UID: \"03158b9d-6cc1-4f5d-af1f-d21de41d536d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 07:04:34 crc kubenswrapper[4738]: E0307 07:04:34.200312 4738 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 07:04:34 crc kubenswrapper[4738]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 07 07:04:34 crc kubenswrapper[4738]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dg4q9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29547784-vqg4n_openshift-infra(a41584d7-5e09-4e4b-9b1d-6aa4b7d13039): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 07 07:04:34 crc kubenswrapper[4738]: > logger="UnhandledError" Mar 07 07:04:34 crc kubenswrapper[4738]: I0307 07:04:34.200339 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03158b9d-6cc1-4f5d-af1f-d21de41d536d-kube-api-access\") pod \"installer-9-crc\" (UID: \"03158b9d-6cc1-4f5d-af1f-d21de41d536d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 07:04:34 crc kubenswrapper[4738]: I0307 07:04:34.200514 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03158b9d-6cc1-4f5d-af1f-d21de41d536d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"03158b9d-6cc1-4f5d-af1f-d21de41d536d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 07:04:34 crc kubenswrapper[4738]: I0307 07:04:34.200600 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/03158b9d-6cc1-4f5d-af1f-d21de41d536d-kubelet-dir\") pod \"installer-9-crc\" (UID: \"03158b9d-6cc1-4f5d-af1f-d21de41d536d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 07:04:34 crc kubenswrapper[4738]: E0307 07:04:34.201537 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29547784-vqg4n" podUID="a41584d7-5e09-4e4b-9b1d-6aa4b7d13039" Mar 07 07:04:34 crc kubenswrapper[4738]: I0307 07:04:34.228609 4738 scope.go:117] "RemoveContainer" containerID="b410e5d6dde4b1b3270ed83fb1769e66989622d81d5915d449f20bac269525c0" Mar 07 07:04:34 crc kubenswrapper[4738]: I0307 07:04:34.229296 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03158b9d-6cc1-4f5d-af1f-d21de41d536d-kube-api-access\") pod \"installer-9-crc\" (UID: \"03158b9d-6cc1-4f5d-af1f-d21de41d536d\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 07 07:04:34 crc kubenswrapper[4738]: E0307 07:04:34.267923 4738 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 07 07:04:34 crc kubenswrapper[4738]: E0307 07:04:34.268650 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x2w9b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rnt9k_openshift-marketplace(4ccab57d-f355-494a-adae-5a1dba9c360a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 07:04:34 crc kubenswrapper[4738]: E0307 07:04:34.269858 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rnt9k" podUID="4ccab57d-f355-494a-adae-5a1dba9c360a" Mar 07 07:04:34 crc 
kubenswrapper[4738]: I0307 07:04:34.348549 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 07 07:04:34 crc kubenswrapper[4738]: E0307 07:04:34.823244 4738 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 07 07:04:34 crc kubenswrapper[4738]: E0307 07:04:34.823444 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lvzg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy
:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-67hw7_openshift-marketplace(11b75931-7617-4db8-b9e0-62a32ccd6948): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 07:04:34 crc kubenswrapper[4738]: E0307 07:04:34.824682 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-67hw7" podUID="11b75931-7617-4db8-b9e0-62a32ccd6948" Mar 07 07:04:34 crc kubenswrapper[4738]: E0307 07:04:34.879007 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29547784-vqg4n" podUID="a41584d7-5e09-4e4b-9b1d-6aa4b7d13039" Mar 07 07:04:35 crc kubenswrapper[4738]: E0307 07:04:35.790255 4738 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 07 07:04:35 crc kubenswrapper[4738]: E0307 07:04:35.790466 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d5l74,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-k4rs8_openshift-marketplace(ef082c9b-8cc2-4c38-8957-28912470b473): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 07:04:35 crc kubenswrapper[4738]: E0307 07:04:35.791650 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-k4rs8" podUID="ef082c9b-8cc2-4c38-8957-28912470b473" Mar 07 07:04:36 crc 
kubenswrapper[4738]: E0307 07:04:36.120314 4738 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 07 07:04:36 crc kubenswrapper[4738]: E0307 07:04:36.120520 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zd8gn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
certified-operators-flk6x_openshift-marketplace(b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 07:04:36 crc kubenswrapper[4738]: E0307 07:04:36.121794 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-flk6x" podUID="b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" Mar 07 07:04:38 crc kubenswrapper[4738]: E0307 07:04:38.278334 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rnt9k" podUID="4ccab57d-f355-494a-adae-5a1dba9c360a" Mar 07 07:04:38 crc kubenswrapper[4738]: E0307 07:04:38.278371 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-flk6x" podUID="b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" Mar 07 07:04:38 crc kubenswrapper[4738]: E0307 07:04:38.278457 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-67hw7" podUID="11b75931-7617-4db8-b9e0-62a32ccd6948" Mar 07 07:04:38 crc kubenswrapper[4738]: E0307 07:04:38.278590 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" 
with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-k4rs8" podUID="ef082c9b-8cc2-4c38-8957-28912470b473" Mar 07 07:04:38 crc kubenswrapper[4738]: E0307 07:04:38.493749 4738 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 07 07:04:38 crc kubenswrapper[4738]: E0307 07:04:38.494211 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4l6n8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFro
m:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-mmvgq_openshift-marketplace(7e052e6e-6b6d-47c5-a745-a1db1cab9669): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 07:04:38 crc kubenswrapper[4738]: E0307 07:04:38.495519 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-mmvgq" podUID="7e052e6e-6b6d-47c5-a745-a1db1cab9669" Mar 07 07:04:38 crc kubenswrapper[4738]: I0307 07:04:38.587389 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 07 07:04:38 crc kubenswrapper[4738]: I0307 07:04:38.845767 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 07 07:04:38 crc kubenswrapper[4738]: I0307 07:04:38.849052 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79f5cfd579-pmfw8"] Mar 07 07:04:38 crc kubenswrapper[4738]: W0307 07:04:38.855406 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod03158b9d_6cc1_4f5d_af1f_d21de41d536d.slice/crio-b77325415d298254b7fe5da429dc386f7a71cad961c9382804f3dbe3963e2b33 WatchSource:0}: Error finding container b77325415d298254b7fe5da429dc386f7a71cad961c9382804f3dbe3963e2b33: Status 404 returned error can't find the container with id b77325415d298254b7fe5da429dc386f7a71cad961c9382804f3dbe3963e2b33 Mar 07 07:04:38 crc kubenswrapper[4738]: W0307 07:04:38.864638 4738 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b021d4e_c013_40ae_8601_9aa2dd013daa.slice/crio-ddbe6dc4d735f8001cabc7617beb230ad966a6b548901729ad9c2b32954124fb WatchSource:0}: Error finding container ddbe6dc4d735f8001cabc7617beb230ad966a6b548901729ad9c2b32954124fb: Status 404 returned error can't find the container with id ddbe6dc4d735f8001cabc7617beb230ad966a6b548901729ad9c2b32954124fb Mar 07 07:04:38 crc kubenswrapper[4738]: I0307 07:04:38.876555 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t"] Mar 07 07:04:38 crc kubenswrapper[4738]: W0307 07:04:38.896014 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedea93fc_1b38_49c3_aaff_696c5cd2cb93.slice/crio-d574554a70a7f55054f5ce33b753f66eec2e2a136d47024b6741b3f0d198141a WatchSource:0}: Error finding container d574554a70a7f55054f5ce33b753f66eec2e2a136d47024b6741b3f0d198141a: Status 404 returned error can't find the container with id d574554a70a7f55054f5ce33b753f66eec2e2a136d47024b6741b3f0d198141a Mar 07 07:04:38 crc kubenswrapper[4738]: I0307 07:04:38.923129 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a77c47d5-392b-4957-8db4-5665a3bb304a","Type":"ContainerStarted","Data":"b539117446f4167ebf753c099c949d03a8607159e57154d4b960feb0af46cb52"} Mar 07 07:04:38 crc kubenswrapper[4738]: I0307 07:04:38.927717 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" event={"ID":"2b021d4e-c013-40ae-8601-9aa2dd013daa","Type":"ContainerStarted","Data":"ddbe6dc4d735f8001cabc7617beb230ad966a6b548901729ad9c2b32954124fb"} Mar 07 07:04:38 crc kubenswrapper[4738]: I0307 07:04:38.929287 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" 
event={"ID":"03158b9d-6cc1-4f5d-af1f-d21de41d536d","Type":"ContainerStarted","Data":"b77325415d298254b7fe5da429dc386f7a71cad961c9382804f3dbe3963e2b33"} Mar 07 07:04:38 crc kubenswrapper[4738]: E0307 07:04:38.934296 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-mmvgq" podUID="7e052e6e-6b6d-47c5-a745-a1db1cab9669" Mar 07 07:04:39 crc kubenswrapper[4738]: I0307 07:04:39.937225 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" event={"ID":"edea93fc-1b38-49c3-aaff-696c5cd2cb93","Type":"ContainerStarted","Data":"d574554a70a7f55054f5ce33b753f66eec2e2a136d47024b6741b3f0d198141a"} Mar 07 07:04:40 crc kubenswrapper[4738]: I0307 07:04:40.946244 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" event={"ID":"edea93fc-1b38-49c3-aaff-696c5cd2cb93","Type":"ContainerStarted","Data":"9c3fa2e156901481bf1521133538cc5dd83dfc268a32aa7eecbe7788b49444fd"} Mar 07 07:04:41 crc kubenswrapper[4738]: E0307 07:04:41.802743 4738 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 07 07:04:41 crc kubenswrapper[4738]: E0307 07:04:41.803282 4738 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-m8ph6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-gfmjp_openshift-marketplace(2731d6f8-adb2-4068-a3c9-162cfcb7de07): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 07 07:04:41 crc kubenswrapper[4738]: E0307 07:04:41.804917 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-gfmjp" podUID="2731d6f8-adb2-4068-a3c9-162cfcb7de07" Mar 07 07:04:41 crc 
kubenswrapper[4738]: I0307 07:04:41.953612 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" event={"ID":"2b021d4e-c013-40ae-8601-9aa2dd013daa","Type":"ContainerStarted","Data":"1ef5a6aa12a09e98379be0ba77be5385e6cc18669f4cf502bb3e8dcf014ccf0c"} Mar 07 07:04:41 crc kubenswrapper[4738]: I0307 07:04:41.953816 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" podUID="2b021d4e-c013-40ae-8601-9aa2dd013daa" containerName="controller-manager" containerID="cri-o://1ef5a6aa12a09e98379be0ba77be5385e6cc18669f4cf502bb3e8dcf014ccf0c" gracePeriod=30 Mar 07 07:04:41 crc kubenswrapper[4738]: I0307 07:04:41.954002 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" Mar 07 07:04:41 crc kubenswrapper[4738]: I0307 07:04:41.958206 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"03158b9d-6cc1-4f5d-af1f-d21de41d536d","Type":"ContainerStarted","Data":"f65517aa8bce3e2b0b5daf26c3cabec7932e713546fd6b2c4c1b0833f5188e14"} Mar 07 07:04:41 crc kubenswrapper[4738]: I0307 07:04:41.961359 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a77c47d5-392b-4957-8db4-5665a3bb304a","Type":"ContainerStarted","Data":"0063cc03766faa907984a181cf5aefcfa1b9aee12cd1bdbbe93dce68694479b0"} Mar 07 07:04:41 crc kubenswrapper[4738]: I0307 07:04:41.961409 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" podUID="edea93fc-1b38-49c3-aaff-696c5cd2cb93" containerName="route-controller-manager" containerID="cri-o://9c3fa2e156901481bf1521133538cc5dd83dfc268a32aa7eecbe7788b49444fd" gracePeriod=30 Mar 07 07:04:41 crc 
kubenswrapper[4738]: I0307 07:04:41.961989 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:41 crc kubenswrapper[4738]: I0307 07:04:41.968687 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" Mar 07 07:04:41 crc kubenswrapper[4738]: I0307 07:04:41.971492 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:41 crc kubenswrapper[4738]: I0307 07:04:41.984480 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" podStartSLOduration=27.984456193 podStartE2EDuration="27.984456193s" podCreationTimestamp="2026-03-07 07:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:04:41.980770963 +0000 UTC m=+300.445758284" watchObservedRunningTime="2026-03-07 07:04:41.984456193 +0000 UTC m=+300.449443514" Mar 07 07:04:41 crc kubenswrapper[4738]: I0307 07:04:41.999468 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=12.999444948 podStartE2EDuration="12.999444948s" podCreationTimestamp="2026-03-07 07:04:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:04:41.997886845 +0000 UTC m=+300.462874166" watchObservedRunningTime="2026-03-07 07:04:41.999444948 +0000 UTC m=+300.464432269" Mar 07 07:04:42 crc kubenswrapper[4738]: I0307 07:04:42.020182 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" podStartSLOduration=28.020146636 podStartE2EDuration="28.020146636s" podCreationTimestamp="2026-03-07 07:04:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:04:42.017201917 +0000 UTC m=+300.482189238" watchObservedRunningTime="2026-03-07 07:04:42.020146636 +0000 UTC m=+300.485133957" Mar 07 07:04:42 crc kubenswrapper[4738]: I0307 07:04:42.104753 4738 patch_prober.go:28] interesting pod/route-controller-manager-644fbb89fd-dj65t container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Mar 07 07:04:42 crc kubenswrapper[4738]: I0307 07:04:42.104842 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" podUID="edea93fc-1b38-49c3-aaff-696c5cd2cb93" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Mar 07 07:04:42 crc kubenswrapper[4738]: I0307 07:04:42.124853 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=9.124831262 podStartE2EDuration="9.124831262s" podCreationTimestamp="2026-03-07 07:04:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:04:42.114663928 +0000 UTC m=+300.579651249" watchObservedRunningTime="2026-03-07 07:04:42.124831262 +0000 UTC m=+300.589818583" Mar 07 07:04:42 crc kubenswrapper[4738]: I0307 07:04:42.969284 4738 generic.go:334] "Generic (PLEG): container finished" 
podID="edea93fc-1b38-49c3-aaff-696c5cd2cb93" containerID="9c3fa2e156901481bf1521133538cc5dd83dfc268a32aa7eecbe7788b49444fd" exitCode=0 Mar 07 07:04:42 crc kubenswrapper[4738]: I0307 07:04:42.969409 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" event={"ID":"edea93fc-1b38-49c3-aaff-696c5cd2cb93","Type":"ContainerDied","Data":"9c3fa2e156901481bf1521133538cc5dd83dfc268a32aa7eecbe7788b49444fd"} Mar 07 07:04:42 crc kubenswrapper[4738]: I0307 07:04:42.971691 4738 generic.go:334] "Generic (PLEG): container finished" podID="a77c47d5-392b-4957-8db4-5665a3bb304a" containerID="0063cc03766faa907984a181cf5aefcfa1b9aee12cd1bdbbe93dce68694479b0" exitCode=0 Mar 07 07:04:42 crc kubenswrapper[4738]: I0307 07:04:42.971745 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a77c47d5-392b-4957-8db4-5665a3bb304a","Type":"ContainerDied","Data":"0063cc03766faa907984a181cf5aefcfa1b9aee12cd1bdbbe93dce68694479b0"} Mar 07 07:04:42 crc kubenswrapper[4738]: I0307 07:04:42.975080 4738 generic.go:334] "Generic (PLEG): container finished" podID="2b021d4e-c013-40ae-8601-9aa2dd013daa" containerID="1ef5a6aa12a09e98379be0ba77be5385e6cc18669f4cf502bb3e8dcf014ccf0c" exitCode=0 Mar 07 07:04:42 crc kubenswrapper[4738]: I0307 07:04:42.975214 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" event={"ID":"2b021d4e-c013-40ae-8601-9aa2dd013daa","Type":"ContainerDied","Data":"1ef5a6aa12a09e98379be0ba77be5385e6cc18669f4cf502bb3e8dcf014ccf0c"} Mar 07 07:04:42 crc kubenswrapper[4738]: E0307 07:04:42.981555 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-gfmjp" 
podUID="2731d6f8-adb2-4068-a3c9-162cfcb7de07" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.273487 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.299491 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.300474 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-df74875fc-75rml"] Mar 07 07:04:43 crc kubenswrapper[4738]: E0307 07:04:43.300740 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edea93fc-1b38-49c3-aaff-696c5cd2cb93" containerName="route-controller-manager" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.300757 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="edea93fc-1b38-49c3-aaff-696c5cd2cb93" containerName="route-controller-manager" Mar 07 07:04:43 crc kubenswrapper[4738]: E0307 07:04:43.300775 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b021d4e-c013-40ae-8601-9aa2dd013daa" containerName="controller-manager" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.300782 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b021d4e-c013-40ae-8601-9aa2dd013daa" containerName="controller-manager" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.300874 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="edea93fc-1b38-49c3-aaff-696c5cd2cb93" containerName="route-controller-manager" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.300887 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b021d4e-c013-40ae-8601-9aa2dd013daa" containerName="controller-manager" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.301283 4738 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.316072 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edea93fc-1b38-49c3-aaff-696c5cd2cb93-config\") pod \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.316190 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kxbs\" (UniqueName: \"kubernetes.io/projected/2b021d4e-c013-40ae-8601-9aa2dd013daa-kube-api-access-6kxbs\") pod \"2b021d4e-c013-40ae-8601-9aa2dd013daa\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.316273 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edea93fc-1b38-49c3-aaff-696c5cd2cb93-client-ca\") pod \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.316327 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjk9g\" (UniqueName: \"kubernetes.io/projected/edea93fc-1b38-49c3-aaff-696c5cd2cb93-kube-api-access-zjk9g\") pod \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.316422 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edea93fc-1b38-49c3-aaff-696c5cd2cb93-serving-cert\") pod \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\" (UID: \"edea93fc-1b38-49c3-aaff-696c5cd2cb93\") " Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.316481 4738 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-proxy-ca-bundles\") pod \"2b021d4e-c013-40ae-8601-9aa2dd013daa\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.316525 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-client-ca\") pod \"2b021d4e-c013-40ae-8601-9aa2dd013daa\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.316573 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-config\") pod \"2b021d4e-c013-40ae-8601-9aa2dd013daa\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.319017 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b021d4e-c013-40ae-8601-9aa2dd013daa-serving-cert\") pod \"2b021d4e-c013-40ae-8601-9aa2dd013daa\" (UID: \"2b021d4e-c013-40ae-8601-9aa2dd013daa\") " Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.319018 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2b021d4e-c013-40ae-8601-9aa2dd013daa" (UID: "2b021d4e-c013-40ae-8601-9aa2dd013daa"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.319547 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-config\") pod \"controller-manager-df74875fc-75rml\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.319588 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-proxy-ca-bundles\") pod \"controller-manager-df74875fc-75rml\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.319619 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-client-ca\") pod \"controller-manager-df74875fc-75rml\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.319733 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83c9105f-ba15-4aa4-8310-cf1616dd359a-serving-cert\") pod \"controller-manager-df74875fc-75rml\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.319784 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5dzg\" (UniqueName: 
\"kubernetes.io/projected/83c9105f-ba15-4aa4-8310-cf1616dd359a-kube-api-access-b5dzg\") pod \"controller-manager-df74875fc-75rml\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.325040 4738 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.320484 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edea93fc-1b38-49c3-aaff-696c5cd2cb93-config" (OuterVolumeSpecName: "config") pod "edea93fc-1b38-49c3-aaff-696c5cd2cb93" (UID: "edea93fc-1b38-49c3-aaff-696c5cd2cb93"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.320942 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edea93fc-1b38-49c3-aaff-696c5cd2cb93-client-ca" (OuterVolumeSpecName: "client-ca") pod "edea93fc-1b38-49c3-aaff-696c5cd2cb93" (UID: "edea93fc-1b38-49c3-aaff-696c5cd2cb93"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.321341 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-client-ca" (OuterVolumeSpecName: "client-ca") pod "2b021d4e-c013-40ae-8601-9aa2dd013daa" (UID: "2b021d4e-c013-40ae-8601-9aa2dd013daa"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.321831 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-config" (OuterVolumeSpecName: "config") pod "2b021d4e-c013-40ae-8601-9aa2dd013daa" (UID: "2b021d4e-c013-40ae-8601-9aa2dd013daa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.325480 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b021d4e-c013-40ae-8601-9aa2dd013daa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2b021d4e-c013-40ae-8601-9aa2dd013daa" (UID: "2b021d4e-c013-40ae-8601-9aa2dd013daa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.325850 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edea93fc-1b38-49c3-aaff-696c5cd2cb93-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "edea93fc-1b38-49c3-aaff-696c5cd2cb93" (UID: "edea93fc-1b38-49c3-aaff-696c5cd2cb93"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.326021 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-df74875fc-75rml"] Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.326474 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edea93fc-1b38-49c3-aaff-696c5cd2cb93-kube-api-access-zjk9g" (OuterVolumeSpecName: "kube-api-access-zjk9g") pod "edea93fc-1b38-49c3-aaff-696c5cd2cb93" (UID: "edea93fc-1b38-49c3-aaff-696c5cd2cb93"). InnerVolumeSpecName "kube-api-access-zjk9g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.329629 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b021d4e-c013-40ae-8601-9aa2dd013daa-kube-api-access-6kxbs" (OuterVolumeSpecName: "kube-api-access-6kxbs") pod "2b021d4e-c013-40ae-8601-9aa2dd013daa" (UID: "2b021d4e-c013-40ae-8601-9aa2dd013daa"). InnerVolumeSpecName "kube-api-access-6kxbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.426733 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-config\") pod \"controller-manager-df74875fc-75rml\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.426779 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-proxy-ca-bundles\") pod \"controller-manager-df74875fc-75rml\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.426800 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-client-ca\") pod \"controller-manager-df74875fc-75rml\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.426858 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/83c9105f-ba15-4aa4-8310-cf1616dd359a-serving-cert\") pod \"controller-manager-df74875fc-75rml\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.426893 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5dzg\" (UniqueName: \"kubernetes.io/projected/83c9105f-ba15-4aa4-8310-cf1616dd359a-kube-api-access-b5dzg\") pod \"controller-manager-df74875fc-75rml\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.426953 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/edea93fc-1b38-49c3-aaff-696c5cd2cb93-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.426989 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.426999 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2b021d4e-c013-40ae-8601-9aa2dd013daa-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.427008 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2b021d4e-c013-40ae-8601-9aa2dd013daa-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.427017 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/edea93fc-1b38-49c3-aaff-696c5cd2cb93-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:43 crc kubenswrapper[4738]: 
I0307 07:04:43.427026 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kxbs\" (UniqueName: \"kubernetes.io/projected/2b021d4e-c013-40ae-8601-9aa2dd013daa-kube-api-access-6kxbs\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.427036 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/edea93fc-1b38-49c3-aaff-696c5cd2cb93-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.427045 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjk9g\" (UniqueName: \"kubernetes.io/projected/edea93fc-1b38-49c3-aaff-696c5cd2cb93-kube-api-access-zjk9g\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.428723 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-client-ca\") pod \"controller-manager-df74875fc-75rml\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.429105 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-proxy-ca-bundles\") pod \"controller-manager-df74875fc-75rml\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.429126 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-config\") pod \"controller-manager-df74875fc-75rml\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 
07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.433042 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83c9105f-ba15-4aa4-8310-cf1616dd359a-serving-cert\") pod \"controller-manager-df74875fc-75rml\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.445770 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5dzg\" (UniqueName: \"kubernetes.io/projected/83c9105f-ba15-4aa4-8310-cf1616dd359a-kube-api-access-b5dzg\") pod \"controller-manager-df74875fc-75rml\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.642840 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:04:43 crc kubenswrapper[4738]: I0307 07:04:43.943503 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-df74875fc-75rml"] Mar 07 07:04:43 crc kubenswrapper[4738]: W0307 07:04:43.956705 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83c9105f_ba15_4aa4_8310_cf1616dd359a.slice/crio-1a1f5195758e9b34de82fe47b0aefff5a89c972037b39dcad0342e31d8285d37 WatchSource:0}: Error finding container 1a1f5195758e9b34de82fe47b0aefff5a89c972037b39dcad0342e31d8285d37: Status 404 returned error can't find the container with id 1a1f5195758e9b34de82fe47b0aefff5a89c972037b39dcad0342e31d8285d37 Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.000990 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" 
event={"ID":"edea93fc-1b38-49c3-aaff-696c5cd2cb93","Type":"ContainerDied","Data":"d574554a70a7f55054f5ce33b753f66eec2e2a136d47024b6741b3f0d198141a"} Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.001109 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t" Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.001243 4738 scope.go:117] "RemoveContainer" containerID="9c3fa2e156901481bf1521133538cc5dd83dfc268a32aa7eecbe7788b49444fd" Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.005074 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-df74875fc-75rml" event={"ID":"83c9105f-ba15-4aa4-8310-cf1616dd359a","Type":"ContainerStarted","Data":"1a1f5195758e9b34de82fe47b0aefff5a89c972037b39dcad0342e31d8285d37"} Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.008423 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" event={"ID":"2b021d4e-c013-40ae-8601-9aa2dd013daa","Type":"ContainerDied","Data":"ddbe6dc4d735f8001cabc7617beb230ad966a6b548901729ad9c2b32954124fb"} Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.008465 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79f5cfd579-pmfw8" Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.029842 4738 scope.go:117] "RemoveContainer" containerID="1ef5a6aa12a09e98379be0ba77be5385e6cc18669f4cf502bb3e8dcf014ccf0c" Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.060541 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t"] Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.070945 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-644fbb89fd-dj65t"] Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.077210 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79f5cfd579-pmfw8"] Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.080577 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-79f5cfd579-pmfw8"] Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.340441 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.393419 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b021d4e-c013-40ae-8601-9aa2dd013daa" path="/var/lib/kubelet/pods/2b021d4e-c013-40ae-8601-9aa2dd013daa/volumes" Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.394240 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edea93fc-1b38-49c3-aaff-696c5cd2cb93" path="/var/lib/kubelet/pods/edea93fc-1b38-49c3-aaff-696c5cd2cb93/volumes" Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.441938 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a77c47d5-392b-4957-8db4-5665a3bb304a-kubelet-dir\") pod \"a77c47d5-392b-4957-8db4-5665a3bb304a\" (UID: \"a77c47d5-392b-4957-8db4-5665a3bb304a\") " Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.442019 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a77c47d5-392b-4957-8db4-5665a3bb304a-kube-api-access\") pod \"a77c47d5-392b-4957-8db4-5665a3bb304a\" (UID: \"a77c47d5-392b-4957-8db4-5665a3bb304a\") " Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.442058 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a77c47d5-392b-4957-8db4-5665a3bb304a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a77c47d5-392b-4957-8db4-5665a3bb304a" (UID: "a77c47d5-392b-4957-8db4-5665a3bb304a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.442501 4738 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a77c47d5-392b-4957-8db4-5665a3bb304a-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.451980 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a77c47d5-392b-4957-8db4-5665a3bb304a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a77c47d5-392b-4957-8db4-5665a3bb304a" (UID: "a77c47d5-392b-4957-8db4-5665a3bb304a"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.544448 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a77c47d5-392b-4957-8db4-5665a3bb304a-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.655036 4738 csr.go:261] certificate signing request csr-dc9tx is approved, waiting to be issued Mar 07 07:04:44 crc kubenswrapper[4738]: I0307 07:04:44.661809 4738 csr.go:257] certificate signing request csr-dc9tx is issued Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.016145 4738 generic.go:334] "Generic (PLEG): container finished" podID="09f978c5-7fd4-4852-95c4-915304c1bf18" containerID="fca17f0eff9dfb92737a52be4b29ea4279eb260fc2eec54cab232cb730e635af" exitCode=0 Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.016248 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547782-mrbc5" event={"ID":"09f978c5-7fd4-4852-95c4-915304c1bf18","Type":"ContainerDied","Data":"fca17f0eff9dfb92737a52be4b29ea4279eb260fc2eec54cab232cb730e635af"} Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.021506 4738 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"a77c47d5-392b-4957-8db4-5665a3bb304a","Type":"ContainerDied","Data":"b539117446f4167ebf753c099c949d03a8607159e57154d4b960feb0af46cb52"}
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.021563 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b539117446f4167ebf753c099c949d03a8607159e57154d4b960feb0af46cb52"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.021605 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.028405 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-df74875fc-75rml" event={"ID":"83c9105f-ba15-4aa4-8310-cf1616dd359a","Type":"ContainerStarted","Data":"a713eab4e3ae69458ebcf81e5a3b8741157d614608ead4947f69e82f8662a06f"}
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.029151 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-df74875fc-75rml"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.035378 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-df74875fc-75rml"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.053284 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-df74875fc-75rml" podStartSLOduration=11.053263988 podStartE2EDuration="11.053263988s" podCreationTimestamp="2026-03-07 07:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:04:45.048973272 +0000 UTC m=+303.513960593" watchObservedRunningTime="2026-03-07 07:04:45.053263988 +0000 UTC m=+303.518251309"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.662662 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"]
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.663215 4738 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-15 10:29:42.211657636 +0000 UTC
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.663248 4738 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7539h24m56.548412522s for next certificate rotation
Mar 07 07:04:45 crc kubenswrapper[4738]: E0307 07:04:45.663411 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a77c47d5-392b-4957-8db4-5665a3bb304a" containerName="pruner"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.663429 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="a77c47d5-392b-4957-8db4-5665a3bb304a" containerName="pruner"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.663560 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="a77c47d5-392b-4957-8db4-5665a3bb304a" containerName="pruner"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.664082 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.666415 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.666762 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.666954 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.667087 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.667509 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.667665 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.673295 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"]
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.762318 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f1ba920-86ac-4a49-80e7-356fad40b91b-client-ca\") pod \"route-controller-manager-748655f4c9-mnc6l\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.762419 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f1ba920-86ac-4a49-80e7-356fad40b91b-serving-cert\") pod \"route-controller-manager-748655f4c9-mnc6l\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.762527 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f1ba920-86ac-4a49-80e7-356fad40b91b-config\") pod \"route-controller-manager-748655f4c9-mnc6l\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.762712 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qhtl\" (UniqueName: \"kubernetes.io/projected/1f1ba920-86ac-4a49-80e7-356fad40b91b-kube-api-access-6qhtl\") pod \"route-controller-manager-748655f4c9-mnc6l\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.863581 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f1ba920-86ac-4a49-80e7-356fad40b91b-config\") pod \"route-controller-manager-748655f4c9-mnc6l\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.864149 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qhtl\" (UniqueName: \"kubernetes.io/projected/1f1ba920-86ac-4a49-80e7-356fad40b91b-kube-api-access-6qhtl\") pod \"route-controller-manager-748655f4c9-mnc6l\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.864345 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f1ba920-86ac-4a49-80e7-356fad40b91b-client-ca\") pod \"route-controller-manager-748655f4c9-mnc6l\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.864664 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f1ba920-86ac-4a49-80e7-356fad40b91b-serving-cert\") pod \"route-controller-manager-748655f4c9-mnc6l\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.865490 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f1ba920-86ac-4a49-80e7-356fad40b91b-client-ca\") pod \"route-controller-manager-748655f4c9-mnc6l\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.866282 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f1ba920-86ac-4a49-80e7-356fad40b91b-config\") pod \"route-controller-manager-748655f4c9-mnc6l\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.871463 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f1ba920-86ac-4a49-80e7-356fad40b91b-serving-cert\") pod \"route-controller-manager-748655f4c9-mnc6l\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.883582 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qhtl\" (UniqueName: \"kubernetes.io/projected/1f1ba920-86ac-4a49-80e7-356fad40b91b-kube-api-access-6qhtl\") pod \"route-controller-manager-748655f4c9-mnc6l\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:45 crc kubenswrapper[4738]: I0307 07:04:45.983958 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:46 crc kubenswrapper[4738]: I0307 07:04:46.037300 4738 generic.go:334] "Generic (PLEG): container finished" podID="f34dc9c2-f90f-473e-8dca-a3df3f70e02f" containerID="baef5f61b5c6a47da32c50681175f6adc2a830d69eda495f46c36fa665ff6691" exitCode=0
Mar 07 07:04:46 crc kubenswrapper[4738]: I0307 07:04:46.037359 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6bh9" event={"ID":"f34dc9c2-f90f-473e-8dca-a3df3f70e02f","Type":"ContainerDied","Data":"baef5f61b5c6a47da32c50681175f6adc2a830d69eda495f46c36fa665ff6691"}
Mar 07 07:04:46 crc kubenswrapper[4738]: I0307 07:04:46.305548 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547782-mrbc5"
Mar 07 07:04:46 crc kubenswrapper[4738]: I0307 07:04:46.373434 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wmxrs\" (UniqueName: \"kubernetes.io/projected/09f978c5-7fd4-4852-95c4-915304c1bf18-kube-api-access-wmxrs\") pod \"09f978c5-7fd4-4852-95c4-915304c1bf18\" (UID: \"09f978c5-7fd4-4852-95c4-915304c1bf18\") "
Mar 07 07:04:46 crc kubenswrapper[4738]: I0307 07:04:46.382057 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09f978c5-7fd4-4852-95c4-915304c1bf18-kube-api-access-wmxrs" (OuterVolumeSpecName: "kube-api-access-wmxrs") pod "09f978c5-7fd4-4852-95c4-915304c1bf18" (UID: "09f978c5-7fd4-4852-95c4-915304c1bf18"). InnerVolumeSpecName "kube-api-access-wmxrs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:04:46 crc kubenswrapper[4738]: I0307 07:04:46.466101 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"]
Mar 07 07:04:46 crc kubenswrapper[4738]: I0307 07:04:46.474908 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wmxrs\" (UniqueName: \"kubernetes.io/projected/09f978c5-7fd4-4852-95c4-915304c1bf18-kube-api-access-wmxrs\") on node \"crc\" DevicePath \"\""
Mar 07 07:04:46 crc kubenswrapper[4738]: W0307 07:04:46.478127 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f1ba920_86ac_4a49_80e7_356fad40b91b.slice/crio-090b57359839aacf04bedab27a880b2ae8b35050fe9b235b566ae8c536b9eac7 WatchSource:0}: Error finding container 090b57359839aacf04bedab27a880b2ae8b35050fe9b235b566ae8c536b9eac7: Status 404 returned error can't find the container with id 090b57359839aacf04bedab27a880b2ae8b35050fe9b235b566ae8c536b9eac7
Mar 07 07:04:46 crc kubenswrapper[4738]: I0307 07:04:46.663655 4738 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-12 08:11:45.886059401 +0000 UTC
Mar 07 07:04:46 crc kubenswrapper[4738]: I0307 07:04:46.664371 4738 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6721h6m59.221696405s for next certificate rotation
Mar 07 07:04:47 crc kubenswrapper[4738]: I0307 07:04:47.044025 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547782-mrbc5" event={"ID":"09f978c5-7fd4-4852-95c4-915304c1bf18","Type":"ContainerDied","Data":"60a375b1357d468345d00821cfa7fac44badfc3c83b95c00f04fc35f5d8ad0d9"}
Mar 07 07:04:47 crc kubenswrapper[4738]: I0307 07:04:47.044073 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60a375b1357d468345d00821cfa7fac44badfc3c83b95c00f04fc35f5d8ad0d9"
Mar 07 07:04:47 crc kubenswrapper[4738]: I0307 07:04:47.044158 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547782-mrbc5"
Mar 07 07:04:47 crc kubenswrapper[4738]: I0307 07:04:47.051719 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6bh9" event={"ID":"f34dc9c2-f90f-473e-8dca-a3df3f70e02f","Type":"ContainerStarted","Data":"d5b52c7a1fae65f0a2915cfbf369956f042ec4e9468ba591e2b6e6d7fb92015e"}
Mar 07 07:04:47 crc kubenswrapper[4738]: I0307 07:04:47.053307 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l" event={"ID":"1f1ba920-86ac-4a49-80e7-356fad40b91b","Type":"ContainerStarted","Data":"5033f8785c0dea8d9cb20ae65fa2d765ccde48d713d62a92cae05668cccf5425"}
Mar 07 07:04:47 crc kubenswrapper[4738]: I0307 07:04:47.053467 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:47 crc kubenswrapper[4738]: I0307 07:04:47.053554 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l" event={"ID":"1f1ba920-86ac-4a49-80e7-356fad40b91b","Type":"ContainerStarted","Data":"090b57359839aacf04bedab27a880b2ae8b35050fe9b235b566ae8c536b9eac7"}
Mar 07 07:04:47 crc kubenswrapper[4738]: I0307 07:04:47.059892 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"
Mar 07 07:04:47 crc kubenswrapper[4738]: I0307 07:04:47.078617 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l" podStartSLOduration=13.078600428 podStartE2EDuration="13.078600428s" podCreationTimestamp="2026-03-07 07:04:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:04:47.076553382 +0000 UTC m=+305.541540703" watchObservedRunningTime="2026-03-07 07:04:47.078600428 +0000 UTC m=+305.543587749"
Mar 07 07:04:47 crc kubenswrapper[4738]: I0307 07:04:47.102786 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-t6bh9" podStartSLOduration=2.372268871 podStartE2EDuration="53.10276435s" podCreationTimestamp="2026-03-07 07:03:54 +0000 UTC" firstStartedPulling="2026-03-07 07:03:55.969300684 +0000 UTC m=+254.434288005" lastFinishedPulling="2026-03-07 07:04:46.699796163 +0000 UTC m=+305.164783484" observedRunningTime="2026-03-07 07:04:47.100254182 +0000 UTC m=+305.565241513" watchObservedRunningTime="2026-03-07 07:04:47.10276435 +0000 UTC m=+305.567751681"
Mar 07 07:04:49 crc kubenswrapper[4738]: I0307 07:04:49.065531 4738 generic.go:334] "Generic (PLEG): container finished" podID="b82c0da7-caec-462d-85e9-f3c45cc042b5" containerID="7ec70bde127e5efa595a79991ae933070bbc3143eb92f970d10106225c9843b0" exitCode=0
Mar 07 07:04:49 crc kubenswrapper[4738]: I0307 07:04:49.065602 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-587ww" event={"ID":"b82c0da7-caec-462d-85e9-f3c45cc042b5","Type":"ContainerDied","Data":"7ec70bde127e5efa595a79991ae933070bbc3143eb92f970d10106225c9843b0"}
Mar 07 07:04:49 crc kubenswrapper[4738]: I0307 07:04:49.067640 4738 generic.go:334] "Generic (PLEG): container finished" podID="a41584d7-5e09-4e4b-9b1d-6aa4b7d13039" containerID="872d29a6f1ca6bde82c01c7da420f4a8dcad68c5b40f2d47a1be8a69e4b721e6" exitCode=0
Mar 07 07:04:49 crc kubenswrapper[4738]: I0307 07:04:49.067695 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547784-vqg4n" event={"ID":"a41584d7-5e09-4e4b-9b1d-6aa4b7d13039","Type":"ContainerDied","Data":"872d29a6f1ca6bde82c01c7da420f4a8dcad68c5b40f2d47a1be8a69e4b721e6"}
Mar 07 07:04:50 crc kubenswrapper[4738]: I0307 07:04:50.076408 4738 generic.go:334] "Generic (PLEG): container finished" podID="4ccab57d-f355-494a-adae-5a1dba9c360a" containerID="f85e7306c0fe9641cacc9ce61234d007671bac5fc94b72ed6c37ad81f0fd4ccb" exitCode=0
Mar 07 07:04:50 crc kubenswrapper[4738]: I0307 07:04:50.076494 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnt9k" event={"ID":"4ccab57d-f355-494a-adae-5a1dba9c360a","Type":"ContainerDied","Data":"f85e7306c0fe9641cacc9ce61234d007671bac5fc94b72ed6c37ad81f0fd4ccb"}
Mar 07 07:04:50 crc kubenswrapper[4738]: I0307 07:04:50.079262 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-587ww" event={"ID":"b82c0da7-caec-462d-85e9-f3c45cc042b5","Type":"ContainerStarted","Data":"897ac86f59ba3751505c36a9676f6c3a0f804e530e6fe99db330195d77a63ea8"}
Mar 07 07:04:50 crc kubenswrapper[4738]: I0307 07:04:50.500416 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547784-vqg4n"
Mar 07 07:04:50 crc kubenswrapper[4738]: I0307 07:04:50.517675 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-587ww" podStartSLOduration=4.154698632 podStartE2EDuration="56.517654768s" podCreationTimestamp="2026-03-07 07:03:54 +0000 UTC" firstStartedPulling="2026-03-07 07:03:57.132328905 +0000 UTC m=+255.597316216" lastFinishedPulling="2026-03-07 07:04:49.495285021 +0000 UTC m=+307.960272352" observedRunningTime="2026-03-07 07:04:50.134428173 +0000 UTC m=+308.599415514" watchObservedRunningTime="2026-03-07 07:04:50.517654768 +0000 UTC m=+308.982642089"
Mar 07 07:04:50 crc kubenswrapper[4738]: I0307 07:04:50.641546 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dg4q9\" (UniqueName: \"kubernetes.io/projected/a41584d7-5e09-4e4b-9b1d-6aa4b7d13039-kube-api-access-dg4q9\") pod \"a41584d7-5e09-4e4b-9b1d-6aa4b7d13039\" (UID: \"a41584d7-5e09-4e4b-9b1d-6aa4b7d13039\") "
Mar 07 07:04:50 crc kubenswrapper[4738]: I0307 07:04:50.648926 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a41584d7-5e09-4e4b-9b1d-6aa4b7d13039-kube-api-access-dg4q9" (OuterVolumeSpecName: "kube-api-access-dg4q9") pod "a41584d7-5e09-4e4b-9b1d-6aa4b7d13039" (UID: "a41584d7-5e09-4e4b-9b1d-6aa4b7d13039"). InnerVolumeSpecName "kube-api-access-dg4q9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:04:50 crc kubenswrapper[4738]: I0307 07:04:50.743530 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dg4q9\" (UniqueName: \"kubernetes.io/projected/a41584d7-5e09-4e4b-9b1d-6aa4b7d13039-kube-api-access-dg4q9\") on node \"crc\" DevicePath \"\""
Mar 07 07:04:51 crc kubenswrapper[4738]: I0307 07:04:51.090138 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnt9k" event={"ID":"4ccab57d-f355-494a-adae-5a1dba9c360a","Type":"ContainerStarted","Data":"876bf0bfd99e9f59e4ce96d97e0a1b03b380e51fb8e6acc67097ce1612d9d83c"}
Mar 07 07:04:51 crc kubenswrapper[4738]: I0307 07:04:51.094571 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547784-vqg4n" event={"ID":"a41584d7-5e09-4e4b-9b1d-6aa4b7d13039","Type":"ContainerDied","Data":"aa131226abe5e48d01625859db4abb219c9eecd2fe33aaac2269a967791d052f"}
Mar 07 07:04:51 crc kubenswrapper[4738]: I0307 07:04:51.094634 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa131226abe5e48d01625859db4abb219c9eecd2fe33aaac2269a967791d052f"
Mar 07 07:04:51 crc kubenswrapper[4738]: I0307 07:04:51.094736 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547784-vqg4n"
Mar 07 07:04:51 crc kubenswrapper[4738]: I0307 07:04:51.098504 4738 generic.go:334] "Generic (PLEG): container finished" podID="ef082c9b-8cc2-4c38-8957-28912470b473" containerID="0d77899c7f5a833a04a2c2ecdc43262719564ff3308a0117893c22785fdcda6a" exitCode=0
Mar 07 07:04:51 crc kubenswrapper[4738]: I0307 07:04:51.098563 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4rs8" event={"ID":"ef082c9b-8cc2-4c38-8957-28912470b473","Type":"ContainerDied","Data":"0d77899c7f5a833a04a2c2ecdc43262719564ff3308a0117893c22785fdcda6a"}
Mar 07 07:04:51 crc kubenswrapper[4738]: I0307 07:04:51.115787 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rnt9k" podStartSLOduration=3.038429413 podStartE2EDuration="55.115764772s" podCreationTimestamp="2026-03-07 07:03:56 +0000 UTC" firstStartedPulling="2026-03-07 07:03:58.394753608 +0000 UTC m=+256.859740929" lastFinishedPulling="2026-03-07 07:04:50.472088967 +0000 UTC m=+308.937076288" observedRunningTime="2026-03-07 07:04:51.111418725 +0000 UTC m=+309.576406046" watchObservedRunningTime="2026-03-07 07:04:51.115764772 +0000 UTC m=+309.580752093"
Mar 07 07:04:52 crc kubenswrapper[4738]: I0307 07:04:52.107190 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4rs8" event={"ID":"ef082c9b-8cc2-4c38-8957-28912470b473","Type":"ContainerStarted","Data":"e5d1288a1b0db4fc89908bc506c8bdec4d3bfb0e7e63dda96ba404ca40cf1187"}
Mar 07 07:04:52 crc kubenswrapper[4738]: I0307 07:04:52.129969 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k4rs8" podStartSLOduration=3.771407705 podStartE2EDuration="58.129948747s" podCreationTimestamp="2026-03-07 07:03:54 +0000 UTC" firstStartedPulling="2026-03-07 07:03:57.132270353 +0000 UTC m=+255.597257674" lastFinishedPulling="2026-03-07 07:04:51.490811395 +0000 UTC m=+309.955798716" observedRunningTime="2026-03-07 07:04:52.127035639 +0000 UTC m=+310.592022960" watchObservedRunningTime="2026-03-07 07:04:52.129948747 +0000 UTC m=+310.594936068"
Mar 07 07:04:53 crc kubenswrapper[4738]: I0307 07:04:53.115788 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67hw7" event={"ID":"11b75931-7617-4db8-b9e0-62a32ccd6948","Type":"ContainerStarted","Data":"e20fa54a4e1b3410a9e5c3569544cd786b62cab5a2267cdbd99eab9a32b05438"}
Mar 07 07:04:54 crc kubenswrapper[4738]: I0307 07:04:54.123934 4738 generic.go:334] "Generic (PLEG): container finished" podID="b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" containerID="6639eaf3ca924c63600fe96f1ab2d6d43c868d32fe13e94a721bba7714c6332a" exitCode=0
Mar 07 07:04:54 crc kubenswrapper[4738]: I0307 07:04:54.124017 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-flk6x" event={"ID":"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0","Type":"ContainerDied","Data":"6639eaf3ca924c63600fe96f1ab2d6d43c868d32fe13e94a721bba7714c6332a"}
Mar 07 07:04:54 crc kubenswrapper[4738]: I0307 07:04:54.125979 4738 generic.go:334] "Generic (PLEG): container finished" podID="11b75931-7617-4db8-b9e0-62a32ccd6948" containerID="e20fa54a4e1b3410a9e5c3569544cd786b62cab5a2267cdbd99eab9a32b05438" exitCode=0
Mar 07 07:04:54 crc kubenswrapper[4738]: I0307 07:04:54.126047 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67hw7" event={"ID":"11b75931-7617-4db8-b9e0-62a32ccd6948","Type":"ContainerDied","Data":"e20fa54a4e1b3410a9e5c3569544cd786b62cab5a2267cdbd99eab9a32b05438"}
Mar 07 07:04:54 crc kubenswrapper[4738]: I0307 07:04:54.130222 4738 generic.go:334] "Generic (PLEG): container finished" podID="7e052e6e-6b6d-47c5-a745-a1db1cab9669" containerID="fa96f1e55128a6ea5e9755e1bd7ec4525e54889b9e5729e5b169da0d3ebfcb90" exitCode=0
Mar 07 07:04:54 crc kubenswrapper[4738]: I0307 07:04:54.130267 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmvgq" event={"ID":"7e052e6e-6b6d-47c5-a745-a1db1cab9669","Type":"ContainerDied","Data":"fa96f1e55128a6ea5e9755e1bd7ec4525e54889b9e5729e5b169da0d3ebfcb90"}
Mar 07 07:04:54 crc kubenswrapper[4738]: I0307 07:04:54.457096 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-t6bh9"
Mar 07 07:04:54 crc kubenswrapper[4738]: I0307 07:04:54.457560 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-t6bh9"
Mar 07 07:04:54 crc kubenswrapper[4738]: I0307 07:04:54.619608 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-587ww"
Mar 07 07:04:54 crc kubenswrapper[4738]: I0307 07:04:54.619653 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-587ww"
Mar 07 07:04:54 crc kubenswrapper[4738]: I0307 07:04:54.626115 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-t6bh9"
Mar 07 07:04:54 crc kubenswrapper[4738]: I0307 07:04:54.668090 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-587ww"
Mar 07 07:04:55 crc kubenswrapper[4738]: I0307 07:04:55.038184 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k4rs8"
Mar 07 07:04:55 crc kubenswrapper[4738]: I0307 07:04:55.038757 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k4rs8"
Mar 07 07:04:55 crc kubenswrapper[4738]: I0307 07:04:55.088726 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k4rs8"
Mar 07 07:04:55 crc kubenswrapper[4738]: I0307 07:04:55.139986 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-flk6x" event={"ID":"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0","Type":"ContainerStarted","Data":"47634ba2608ae9bf0a909c7badcf84abc46d9da4b043a538eb76bf24e1ccc57d"}
Mar 07 07:04:55 crc kubenswrapper[4738]: I0307 07:04:55.143582 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67hw7" event={"ID":"11b75931-7617-4db8-b9e0-62a32ccd6948","Type":"ContainerStarted","Data":"86bc2c43fd2ad7ad181510c495d17d8ba6838193a71fe35d7573c52eb2b3a81e"}
Mar 07 07:04:55 crc kubenswrapper[4738]: I0307 07:04:55.146131 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmvgq" event={"ID":"7e052e6e-6b6d-47c5-a745-a1db1cab9669","Type":"ContainerStarted","Data":"6fd91c5c30a47e7ead9459af3aafc70761954c05799241528d1f9e2d7d32c4ff"}
Mar 07 07:04:55 crc kubenswrapper[4738]: I0307 07:04:55.179818 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-flk6x" podStartSLOduration=3.6667529119999998 podStartE2EDuration="1m1.179789881s" podCreationTimestamp="2026-03-07 07:03:54 +0000 UTC" firstStartedPulling="2026-03-07 07:03:57.245976535 +0000 UTC m=+255.710963856" lastFinishedPulling="2026-03-07 07:04:54.759013504 +0000 UTC m=+313.224000825" observedRunningTime="2026-03-07 07:04:55.175657629 +0000 UTC m=+313.640644950" watchObservedRunningTime="2026-03-07 07:04:55.179789881 +0000 UTC m=+313.644777212"
Mar 07 07:04:55 crc kubenswrapper[4738]: I0307 07:04:55.191708 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-t6bh9"
Mar 07 07:04:55 crc kubenswrapper[4738]: I0307 07:04:55.194494 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-587ww"
Mar 07 07:04:55 crc kubenswrapper[4738]: I0307 07:04:55.235539 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mmvgq" podStartSLOduration=12.559797512 podStartE2EDuration="58.235517896s" podCreationTimestamp="2026-03-07 07:03:57 +0000 UTC" firstStartedPulling="2026-03-07 07:04:08.928987865 +0000 UTC m=+267.393975186" lastFinishedPulling="2026-03-07 07:04:54.604708249 +0000 UTC m=+313.069695570" observedRunningTime="2026-03-07 07:04:55.233487091 +0000 UTC m=+313.698474422" watchObservedRunningTime="2026-03-07 07:04:55.235517896 +0000 UTC m=+313.700505217"
Mar 07 07:04:55 crc kubenswrapper[4738]: I0307 07:04:55.238994 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-67hw7" podStartSLOduration=2.960068895 podStartE2EDuration="59.238977489s" podCreationTimestamp="2026-03-07 07:03:56 +0000 UTC" firstStartedPulling="2026-03-07 07:03:58.361655332 +0000 UTC m=+256.826642653" lastFinishedPulling="2026-03-07 07:04:54.640563936 +0000 UTC m=+313.105551247" observedRunningTime="2026-03-07 07:04:55.202464903 +0000 UTC m=+313.667452224" watchObservedRunningTime="2026-03-07 07:04:55.238977489 +0000 UTC m=+313.703964810"
Mar 07 07:04:55 crc kubenswrapper[4738]: I0307 07:04:55.666822 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-7vtnd"]
Mar 07 07:04:56 crc kubenswrapper[4738]: I0307 07:04:56.582711 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rnt9k"
Mar 07 07:04:56 crc kubenswrapper[4738]: I0307 07:04:56.582796 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rnt9k"
Mar 07 07:04:56 crc kubenswrapper[4738]: I0307 07:04:56.641440 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rnt9k"
Mar 07 07:04:56 crc kubenswrapper[4738]: I0307 07:04:56.957663 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 07:04:56 crc kubenswrapper[4738]: I0307 07:04:56.957741 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 07:04:56 crc kubenswrapper[4738]: I0307 07:04:56.957797 4738 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc"
Mar 07 07:04:56 crc kubenswrapper[4738]: I0307 07:04:56.958632 4738 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6686bb75ec61d2366c90bdd12be9fac7b166c389fab7afae68a9fa902670abc3"} pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 07 07:04:56 crc kubenswrapper[4738]: I0307 07:04:56.958713 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" containerID="cri-o://6686bb75ec61d2366c90bdd12be9fac7b166c389fab7afae68a9fa902670abc3" gracePeriod=600
Mar 07 07:04:57 crc kubenswrapper[4738]: I0307 07:04:57.022559 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-67hw7"
Mar 07 07:04:57 crc kubenswrapper[4738]: I0307 07:04:57.022661 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-67hw7"
Mar 07 07:04:57 crc kubenswrapper[4738]: I0307 07:04:57.195331 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rnt9k"
Mar 07 07:04:58 crc kubenswrapper[4738]: I0307 07:04:58.021278 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mmvgq"
Mar 07 07:04:58 crc kubenswrapper[4738]: I0307 07:04:58.022852 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mmvgq"
Mar 07 07:04:58 crc kubenswrapper[4738]: I0307 07:04:58.074122 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-67hw7" podUID="11b75931-7617-4db8-b9e0-62a32ccd6948" containerName="registry-server" probeResult="failure" output=<
Mar 07 07:04:58 crc kubenswrapper[4738]: timeout: failed to connect service ":50051" within 1s
Mar 07 07:04:58 crc kubenswrapper[4738]: >
Mar 07 07:04:58 crc kubenswrapper[4738]: I0307 07:04:58.164910 4738 generic.go:334] "Generic (PLEG): container finished" podID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerID="6686bb75ec61d2366c90bdd12be9fac7b166c389fab7afae68a9fa902670abc3" exitCode=0
Mar 07 07:04:58 crc kubenswrapper[4738]: I0307 07:04:58.164989 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerDied","Data":"6686bb75ec61d2366c90bdd12be9fac7b166c389fab7afae68a9fa902670abc3"}
Mar 07 07:04:58 crc kubenswrapper[4738]: I0307 07:04:58.165166 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerStarted","Data":"f2215638c9459ede86496c7b956ec07b5b9ff65e16a9b539256ee476459590df"}
Mar 07 07:04:59 crc kubenswrapper[4738]: I0307 07:04:59.071193 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mmvgq" podUID="7e052e6e-6b6d-47c5-a745-a1db1cab9669" containerName="registry-server" probeResult="failure" output=<
Mar 07 07:04:59 crc kubenswrapper[4738]: timeout: failed to connect service ":50051" within 1s
Mar 07 07:04:59 crc kubenswrapper[4738]: >
Mar 07 07:05:01 crc kubenswrapper[4738]: I0307 07:05:01.181864 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfmjp" event={"ID":"2731d6f8-adb2-4068-a3c9-162cfcb7de07","Type":"ContainerStarted","Data":"fe4de9ef7bba22b9e72537403649ec12746f991179432508152f8f20e2f6fb5d"}
Mar 07 07:05:02 crc kubenswrapper[4738]: I0307 07:05:02.189780 4738 generic.go:334] "Generic (PLEG): container finished" podID="2731d6f8-adb2-4068-a3c9-162cfcb7de07" containerID="fe4de9ef7bba22b9e72537403649ec12746f991179432508152f8f20e2f6fb5d" exitCode=0
Mar 07 07:05:02 crc kubenswrapper[4738]: I0307 07:05:02.189873 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfmjp" event={"ID":"2731d6f8-adb2-4068-a3c9-162cfcb7de07","Type":"ContainerDied","Data":"fe4de9ef7bba22b9e72537403649ec12746f991179432508152f8f20e2f6fb5d"}
Mar 07 07:05:03 crc kubenswrapper[4738]: I0307 07:05:03.200298 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfmjp" event={"ID":"2731d6f8-adb2-4068-a3c9-162cfcb7de07","Type":"ContainerStarted","Data":"783cd768a6c50c8f4785a3e7d7a800f9fda331155faab9f13072762afb0b11c6"}
Mar 07 07:05:03 crc kubenswrapper[4738]: I0307 07:05:03.222750 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gfmjp" podStartSLOduration=3.107437832 podStartE2EDuration="1m6.222730932s" podCreationTimestamp="2026-03-07 07:03:57 +0000 UTC" firstStartedPulling="2026-03-07 07:03:59.505928016 +0000 UTC m=+257.970915337" lastFinishedPulling="2026-03-07 07:05:02.621221116 +0000 UTC m=+321.086208437" observedRunningTime="2026-03-07 07:05:03.220467481 +0000 UTC m=+321.685454812" watchObservedRunningTime="2026-03-07 07:05:03.222730932 +0000 UTC m=+321.687718253"
Mar 07 07:05:04 crc kubenswrapper[4738]: I0307 07:05:04.834984 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-flk6x"
Mar 07 07:05:04 crc kubenswrapper[4738]: I0307 07:05:04.836497 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-flk6x"
Mar 07 07:05:04 crc kubenswrapper[4738]: I0307 07:05:04.876188 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-flk6x"
Mar 07 07:05:05 crc kubenswrapper[4738]: I0307 07:05:05.078724 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k4rs8"
Mar 07 07:05:05 crc kubenswrapper[4738]: I0307 07:05:05.257050 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-flk6x"
Mar 07 07:05:07 crc kubenswrapper[4738]: I0307 07:05:07.058540 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-67hw7"
Mar 07 07:05:07 crc kubenswrapper[4738]: I0307 07:05:07.103133 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-67hw7"
Mar 07 07:05:07 crc kubenswrapper[4738]: I0307 07:05:07.314404 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-flk6x"]
Mar 07 07:05:07 crc kubenswrapper[4738]: I0307
07:05:07.314631 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-flk6x" podUID="b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" containerName="registry-server" containerID="cri-o://47634ba2608ae9bf0a909c7badcf84abc46d9da4b043a538eb76bf24e1ccc57d" gracePeriod=2 Mar 07 07:05:07 crc kubenswrapper[4738]: I0307 07:05:07.624714 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:05:07 crc kubenswrapper[4738]: I0307 07:05:07.624759 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:05:07 crc kubenswrapper[4738]: I0307 07:05:07.851419 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-flk6x" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.024383 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-utilities\") pod \"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0\" (UID: \"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0\") " Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.024458 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd8gn\" (UniqueName: \"kubernetes.io/projected/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-kube-api-access-zd8gn\") pod \"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0\" (UID: \"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0\") " Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.024547 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-catalog-content\") pod \"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0\" (UID: \"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0\") " Mar 07 07:05:08 crc 
kubenswrapper[4738]: I0307 07:05:08.025302 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-utilities" (OuterVolumeSpecName: "utilities") pod "b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" (UID: "b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.040385 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-kube-api-access-zd8gn" (OuterVolumeSpecName: "kube-api-access-zd8gn") pod "b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" (UID: "b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0"). InnerVolumeSpecName "kube-api-access-zd8gn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.061844 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mmvgq" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.101611 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" (UID: "b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.111593 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mmvgq" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.125773 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.125807 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zd8gn\" (UniqueName: \"kubernetes.io/projected/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-kube-api-access-zd8gn\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.125816 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.226731 4738 generic.go:334] "Generic (PLEG): container finished" podID="b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" containerID="47634ba2608ae9bf0a909c7badcf84abc46d9da4b043a538eb76bf24e1ccc57d" exitCode=0 Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.226794 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-flk6x" event={"ID":"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0","Type":"ContainerDied","Data":"47634ba2608ae9bf0a909c7badcf84abc46d9da4b043a538eb76bf24e1ccc57d"} Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.226896 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-flk6x" event={"ID":"b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0","Type":"ContainerDied","Data":"1884d075e7c670c1618a31b9945e8bdb3f3f5615f09fc06cca1866ff2fd12c77"} Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 
07:05:08.226950 4738 scope.go:117] "RemoveContainer" containerID="47634ba2608ae9bf0a909c7badcf84abc46d9da4b043a538eb76bf24e1ccc57d" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.227329 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-flk6x" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.241679 4738 scope.go:117] "RemoveContainer" containerID="6639eaf3ca924c63600fe96f1ab2d6d43c868d32fe13e94a721bba7714c6332a" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.256651 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-flk6x"] Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.265406 4738 scope.go:117] "RemoveContainer" containerID="b2aa0d20f9cafbe9163f5565774777fc8a178df6fc43d5408164bc81ec0eeb73" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.269771 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-flk6x"] Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.299816 4738 scope.go:117] "RemoveContainer" containerID="47634ba2608ae9bf0a909c7badcf84abc46d9da4b043a538eb76bf24e1ccc57d" Mar 07 07:05:08 crc kubenswrapper[4738]: E0307 07:05:08.300389 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47634ba2608ae9bf0a909c7badcf84abc46d9da4b043a538eb76bf24e1ccc57d\": container with ID starting with 47634ba2608ae9bf0a909c7badcf84abc46d9da4b043a538eb76bf24e1ccc57d not found: ID does not exist" containerID="47634ba2608ae9bf0a909c7badcf84abc46d9da4b043a538eb76bf24e1ccc57d" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.300423 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47634ba2608ae9bf0a909c7badcf84abc46d9da4b043a538eb76bf24e1ccc57d"} err="failed to get container status 
\"47634ba2608ae9bf0a909c7badcf84abc46d9da4b043a538eb76bf24e1ccc57d\": rpc error: code = NotFound desc = could not find container \"47634ba2608ae9bf0a909c7badcf84abc46d9da4b043a538eb76bf24e1ccc57d\": container with ID starting with 47634ba2608ae9bf0a909c7badcf84abc46d9da4b043a538eb76bf24e1ccc57d not found: ID does not exist" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.300445 4738 scope.go:117] "RemoveContainer" containerID="6639eaf3ca924c63600fe96f1ab2d6d43c868d32fe13e94a721bba7714c6332a" Mar 07 07:05:08 crc kubenswrapper[4738]: E0307 07:05:08.300928 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6639eaf3ca924c63600fe96f1ab2d6d43c868d32fe13e94a721bba7714c6332a\": container with ID starting with 6639eaf3ca924c63600fe96f1ab2d6d43c868d32fe13e94a721bba7714c6332a not found: ID does not exist" containerID="6639eaf3ca924c63600fe96f1ab2d6d43c868d32fe13e94a721bba7714c6332a" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.300950 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6639eaf3ca924c63600fe96f1ab2d6d43c868d32fe13e94a721bba7714c6332a"} err="failed to get container status \"6639eaf3ca924c63600fe96f1ab2d6d43c868d32fe13e94a721bba7714c6332a\": rpc error: code = NotFound desc = could not find container \"6639eaf3ca924c63600fe96f1ab2d6d43c868d32fe13e94a721bba7714c6332a\": container with ID starting with 6639eaf3ca924c63600fe96f1ab2d6d43c868d32fe13e94a721bba7714c6332a not found: ID does not exist" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.300966 4738 scope.go:117] "RemoveContainer" containerID="b2aa0d20f9cafbe9163f5565774777fc8a178df6fc43d5408164bc81ec0eeb73" Mar 07 07:05:08 crc kubenswrapper[4738]: E0307 07:05:08.302484 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b2aa0d20f9cafbe9163f5565774777fc8a178df6fc43d5408164bc81ec0eeb73\": container with ID starting with b2aa0d20f9cafbe9163f5565774777fc8a178df6fc43d5408164bc81ec0eeb73 not found: ID does not exist" containerID="b2aa0d20f9cafbe9163f5565774777fc8a178df6fc43d5408164bc81ec0eeb73" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.302514 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2aa0d20f9cafbe9163f5565774777fc8a178df6fc43d5408164bc81ec0eeb73"} err="failed to get container status \"b2aa0d20f9cafbe9163f5565774777fc8a178df6fc43d5408164bc81ec0eeb73\": rpc error: code = NotFound desc = could not find container \"b2aa0d20f9cafbe9163f5565774777fc8a178df6fc43d5408164bc81ec0eeb73\": container with ID starting with b2aa0d20f9cafbe9163f5565774777fc8a178df6fc43d5408164bc81ec0eeb73 not found: ID does not exist" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.403507 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" path="/var/lib/kubelet/pods/b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0/volumes" Mar 07 07:05:08 crc kubenswrapper[4738]: I0307 07:05:08.697988 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gfmjp" podUID="2731d6f8-adb2-4068-a3c9-162cfcb7de07" containerName="registry-server" probeResult="failure" output=< Mar 07 07:05:08 crc kubenswrapper[4738]: timeout: failed to connect service ":50051" within 1s Mar 07 07:05:08 crc kubenswrapper[4738]: > Mar 07 07:05:09 crc kubenswrapper[4738]: I0307 07:05:09.517379 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k4rs8"] Mar 07 07:05:09 crc kubenswrapper[4738]: I0307 07:05:09.517652 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k4rs8" podUID="ef082c9b-8cc2-4c38-8957-28912470b473" containerName="registry-server" 
containerID="cri-o://e5d1288a1b0db4fc89908bc506c8bdec4d3bfb0e7e63dda96ba404ca40cf1187" gracePeriod=2 Mar 07 07:05:09 crc kubenswrapper[4738]: I0307 07:05:09.718853 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mmvgq"] Mar 07 07:05:09 crc kubenswrapper[4738]: I0307 07:05:09.720795 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mmvgq" podUID="7e052e6e-6b6d-47c5-a745-a1db1cab9669" containerName="registry-server" containerID="cri-o://6fd91c5c30a47e7ead9459af3aafc70761954c05799241528d1f9e2d7d32c4ff" gracePeriod=2 Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.049337 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k4rs8" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.071410 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef082c9b-8cc2-4c38-8957-28912470b473-catalog-content\") pod \"ef082c9b-8cc2-4c38-8957-28912470b473\" (UID: \"ef082c9b-8cc2-4c38-8957-28912470b473\") " Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.071481 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef082c9b-8cc2-4c38-8957-28912470b473-utilities\") pod \"ef082c9b-8cc2-4c38-8957-28912470b473\" (UID: \"ef082c9b-8cc2-4c38-8957-28912470b473\") " Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.071569 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5l74\" (UniqueName: \"kubernetes.io/projected/ef082c9b-8cc2-4c38-8957-28912470b473-kube-api-access-d5l74\") pod \"ef082c9b-8cc2-4c38-8957-28912470b473\" (UID: \"ef082c9b-8cc2-4c38-8957-28912470b473\") " Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.073256 4738 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef082c9b-8cc2-4c38-8957-28912470b473-utilities" (OuterVolumeSpecName: "utilities") pod "ef082c9b-8cc2-4c38-8957-28912470b473" (UID: "ef082c9b-8cc2-4c38-8957-28912470b473"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.082301 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef082c9b-8cc2-4c38-8957-28912470b473-kube-api-access-d5l74" (OuterVolumeSpecName: "kube-api-access-d5l74") pod "ef082c9b-8cc2-4c38-8957-28912470b473" (UID: "ef082c9b-8cc2-4c38-8957-28912470b473"). InnerVolumeSpecName "kube-api-access-d5l74". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.146446 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef082c9b-8cc2-4c38-8957-28912470b473-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef082c9b-8cc2-4c38-8957-28912470b473" (UID: "ef082c9b-8cc2-4c38-8957-28912470b473"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.173078 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef082c9b-8cc2-4c38-8957-28912470b473-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.173116 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef082c9b-8cc2-4c38-8957-28912470b473-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.173126 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5l74\" (UniqueName: \"kubernetes.io/projected/ef082c9b-8cc2-4c38-8957-28912470b473-kube-api-access-d5l74\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.231133 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mmvgq" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.239596 4738 generic.go:334] "Generic (PLEG): container finished" podID="ef082c9b-8cc2-4c38-8957-28912470b473" containerID="e5d1288a1b0db4fc89908bc506c8bdec4d3bfb0e7e63dda96ba404ca40cf1187" exitCode=0 Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.239833 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4rs8" event={"ID":"ef082c9b-8cc2-4c38-8957-28912470b473","Type":"ContainerDied","Data":"e5d1288a1b0db4fc89908bc506c8bdec4d3bfb0e7e63dda96ba404ca40cf1187"} Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.239989 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k4rs8" event={"ID":"ef082c9b-8cc2-4c38-8957-28912470b473","Type":"ContainerDied","Data":"e1304227800a2b091ce04b4f208ae652cdb6f61393ec12750bc74bcdf7ebd7c9"} Mar 07 07:05:10 crc 
kubenswrapper[4738]: I0307 07:05:10.239864 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k4rs8" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.240077 4738 scope.go:117] "RemoveContainer" containerID="e5d1288a1b0db4fc89908bc506c8bdec4d3bfb0e7e63dda96ba404ca40cf1187" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.242249 4738 generic.go:334] "Generic (PLEG): container finished" podID="7e052e6e-6b6d-47c5-a745-a1db1cab9669" containerID="6fd91c5c30a47e7ead9459af3aafc70761954c05799241528d1f9e2d7d32c4ff" exitCode=0 Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.242284 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmvgq" event={"ID":"7e052e6e-6b6d-47c5-a745-a1db1cab9669","Type":"ContainerDied","Data":"6fd91c5c30a47e7ead9459af3aafc70761954c05799241528d1f9e2d7d32c4ff"} Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.242306 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mmvgq" event={"ID":"7e052e6e-6b6d-47c5-a745-a1db1cab9669","Type":"ContainerDied","Data":"c68a8c4b9b1c003e4ad1618884fdacba43bdfbf231432d052f03e7bd7c760c98"} Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.242365 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mmvgq" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.266966 4738 scope.go:117] "RemoveContainer" containerID="0d77899c7f5a833a04a2c2ecdc43262719564ff3308a0117893c22785fdcda6a" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.273937 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l6n8\" (UniqueName: \"kubernetes.io/projected/7e052e6e-6b6d-47c5-a745-a1db1cab9669-kube-api-access-4l6n8\") pod \"7e052e6e-6b6d-47c5-a745-a1db1cab9669\" (UID: \"7e052e6e-6b6d-47c5-a745-a1db1cab9669\") " Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.274102 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e052e6e-6b6d-47c5-a745-a1db1cab9669-catalog-content\") pod \"7e052e6e-6b6d-47c5-a745-a1db1cab9669\" (UID: \"7e052e6e-6b6d-47c5-a745-a1db1cab9669\") " Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.274172 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e052e6e-6b6d-47c5-a745-a1db1cab9669-utilities\") pod \"7e052e6e-6b6d-47c5-a745-a1db1cab9669\" (UID: \"7e052e6e-6b6d-47c5-a745-a1db1cab9669\") " Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.275053 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e052e6e-6b6d-47c5-a745-a1db1cab9669-utilities" (OuterVolumeSpecName: "utilities") pod "7e052e6e-6b6d-47c5-a745-a1db1cab9669" (UID: "7e052e6e-6b6d-47c5-a745-a1db1cab9669"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.285455 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k4rs8"] Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.293868 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k4rs8"] Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.294326 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e052e6e-6b6d-47c5-a745-a1db1cab9669-kube-api-access-4l6n8" (OuterVolumeSpecName: "kube-api-access-4l6n8") pod "7e052e6e-6b6d-47c5-a745-a1db1cab9669" (UID: "7e052e6e-6b6d-47c5-a745-a1db1cab9669"). InnerVolumeSpecName "kube-api-access-4l6n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.304522 4738 scope.go:117] "RemoveContainer" containerID="f45c01f78e4dab5d8ce7553c0629380d754b39fced51108596354afe67d0ac11" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.319291 4738 scope.go:117] "RemoveContainer" containerID="e5d1288a1b0db4fc89908bc506c8bdec4d3bfb0e7e63dda96ba404ca40cf1187" Mar 07 07:05:10 crc kubenswrapper[4738]: E0307 07:05:10.319922 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5d1288a1b0db4fc89908bc506c8bdec4d3bfb0e7e63dda96ba404ca40cf1187\": container with ID starting with e5d1288a1b0db4fc89908bc506c8bdec4d3bfb0e7e63dda96ba404ca40cf1187 not found: ID does not exist" containerID="e5d1288a1b0db4fc89908bc506c8bdec4d3bfb0e7e63dda96ba404ca40cf1187" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.319985 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5d1288a1b0db4fc89908bc506c8bdec4d3bfb0e7e63dda96ba404ca40cf1187"} err="failed to get container status 
\"e5d1288a1b0db4fc89908bc506c8bdec4d3bfb0e7e63dda96ba404ca40cf1187\": rpc error: code = NotFound desc = could not find container \"e5d1288a1b0db4fc89908bc506c8bdec4d3bfb0e7e63dda96ba404ca40cf1187\": container with ID starting with e5d1288a1b0db4fc89908bc506c8bdec4d3bfb0e7e63dda96ba404ca40cf1187 not found: ID does not exist" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.320012 4738 scope.go:117] "RemoveContainer" containerID="0d77899c7f5a833a04a2c2ecdc43262719564ff3308a0117893c22785fdcda6a" Mar 07 07:05:10 crc kubenswrapper[4738]: E0307 07:05:10.320372 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d77899c7f5a833a04a2c2ecdc43262719564ff3308a0117893c22785fdcda6a\": container with ID starting with 0d77899c7f5a833a04a2c2ecdc43262719564ff3308a0117893c22785fdcda6a not found: ID does not exist" containerID="0d77899c7f5a833a04a2c2ecdc43262719564ff3308a0117893c22785fdcda6a" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.320507 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d77899c7f5a833a04a2c2ecdc43262719564ff3308a0117893c22785fdcda6a"} err="failed to get container status \"0d77899c7f5a833a04a2c2ecdc43262719564ff3308a0117893c22785fdcda6a\": rpc error: code = NotFound desc = could not find container \"0d77899c7f5a833a04a2c2ecdc43262719564ff3308a0117893c22785fdcda6a\": container with ID starting with 0d77899c7f5a833a04a2c2ecdc43262719564ff3308a0117893c22785fdcda6a not found: ID does not exist" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.320635 4738 scope.go:117] "RemoveContainer" containerID="f45c01f78e4dab5d8ce7553c0629380d754b39fced51108596354afe67d0ac11" Mar 07 07:05:10 crc kubenswrapper[4738]: E0307 07:05:10.321085 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f45c01f78e4dab5d8ce7553c0629380d754b39fced51108596354afe67d0ac11\": container with ID starting with f45c01f78e4dab5d8ce7553c0629380d754b39fced51108596354afe67d0ac11 not found: ID does not exist" containerID="f45c01f78e4dab5d8ce7553c0629380d754b39fced51108596354afe67d0ac11" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.321128 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f45c01f78e4dab5d8ce7553c0629380d754b39fced51108596354afe67d0ac11"} err="failed to get container status \"f45c01f78e4dab5d8ce7553c0629380d754b39fced51108596354afe67d0ac11\": rpc error: code = NotFound desc = could not find container \"f45c01f78e4dab5d8ce7553c0629380d754b39fced51108596354afe67d0ac11\": container with ID starting with f45c01f78e4dab5d8ce7553c0629380d754b39fced51108596354afe67d0ac11 not found: ID does not exist" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.321172 4738 scope.go:117] "RemoveContainer" containerID="6fd91c5c30a47e7ead9459af3aafc70761954c05799241528d1f9e2d7d32c4ff" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.344987 4738 scope.go:117] "RemoveContainer" containerID="fa96f1e55128a6ea5e9755e1bd7ec4525e54889b9e5729e5b169da0d3ebfcb90" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.359359 4738 scope.go:117] "RemoveContainer" containerID="c6b9e63d45fa7436d0adb4c89f03af7658bf623bca109c22bf3a397481d22057" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.375499 4738 scope.go:117] "RemoveContainer" containerID="6fd91c5c30a47e7ead9459af3aafc70761954c05799241528d1f9e2d7d32c4ff" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.376382 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7e052e6e-6b6d-47c5-a745-a1db1cab9669-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.376464 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l6n8\" 
(UniqueName: \"kubernetes.io/projected/7e052e6e-6b6d-47c5-a745-a1db1cab9669-kube-api-access-4l6n8\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:10 crc kubenswrapper[4738]: E0307 07:05:10.376487 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fd91c5c30a47e7ead9459af3aafc70761954c05799241528d1f9e2d7d32c4ff\": container with ID starting with 6fd91c5c30a47e7ead9459af3aafc70761954c05799241528d1f9e2d7d32c4ff not found: ID does not exist" containerID="6fd91c5c30a47e7ead9459af3aafc70761954c05799241528d1f9e2d7d32c4ff" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.376645 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fd91c5c30a47e7ead9459af3aafc70761954c05799241528d1f9e2d7d32c4ff"} err="failed to get container status \"6fd91c5c30a47e7ead9459af3aafc70761954c05799241528d1f9e2d7d32c4ff\": rpc error: code = NotFound desc = could not find container \"6fd91c5c30a47e7ead9459af3aafc70761954c05799241528d1f9e2d7d32c4ff\": container with ID starting with 6fd91c5c30a47e7ead9459af3aafc70761954c05799241528d1f9e2d7d32c4ff not found: ID does not exist" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.376741 4738 scope.go:117] "RemoveContainer" containerID="fa96f1e55128a6ea5e9755e1bd7ec4525e54889b9e5729e5b169da0d3ebfcb90" Mar 07 07:05:10 crc kubenswrapper[4738]: E0307 07:05:10.389922 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa96f1e55128a6ea5e9755e1bd7ec4525e54889b9e5729e5b169da0d3ebfcb90\": container with ID starting with fa96f1e55128a6ea5e9755e1bd7ec4525e54889b9e5729e5b169da0d3ebfcb90 not found: ID does not exist" containerID="fa96f1e55128a6ea5e9755e1bd7ec4525e54889b9e5729e5b169da0d3ebfcb90" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.389970 4738 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fa96f1e55128a6ea5e9755e1bd7ec4525e54889b9e5729e5b169da0d3ebfcb90"} err="failed to get container status \"fa96f1e55128a6ea5e9755e1bd7ec4525e54889b9e5729e5b169da0d3ebfcb90\": rpc error: code = NotFound desc = could not find container \"fa96f1e55128a6ea5e9755e1bd7ec4525e54889b9e5729e5b169da0d3ebfcb90\": container with ID starting with fa96f1e55128a6ea5e9755e1bd7ec4525e54889b9e5729e5b169da0d3ebfcb90 not found: ID does not exist" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.390004 4738 scope.go:117] "RemoveContainer" containerID="c6b9e63d45fa7436d0adb4c89f03af7658bf623bca109c22bf3a397481d22057" Mar 07 07:05:10 crc kubenswrapper[4738]: E0307 07:05:10.390718 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6b9e63d45fa7436d0adb4c89f03af7658bf623bca109c22bf3a397481d22057\": container with ID starting with c6b9e63d45fa7436d0adb4c89f03af7658bf623bca109c22bf3a397481d22057 not found: ID does not exist" containerID="c6b9e63d45fa7436d0adb4c89f03af7658bf623bca109c22bf3a397481d22057" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.390757 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6b9e63d45fa7436d0adb4c89f03af7658bf623bca109c22bf3a397481d22057"} err="failed to get container status \"c6b9e63d45fa7436d0adb4c89f03af7658bf623bca109c22bf3a397481d22057\": rpc error: code = NotFound desc = could not find container \"c6b9e63d45fa7436d0adb4c89f03af7658bf623bca109c22bf3a397481d22057\": container with ID starting with c6b9e63d45fa7436d0adb4c89f03af7658bf623bca109c22bf3a397481d22057 not found: ID does not exist" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.394095 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef082c9b-8cc2-4c38-8957-28912470b473" path="/var/lib/kubelet/pods/ef082c9b-8cc2-4c38-8957-28912470b473/volumes" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 
07:05:10.441517 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7e052e6e-6b6d-47c5-a745-a1db1cab9669-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7e052e6e-6b6d-47c5-a745-a1db1cab9669" (UID: "7e052e6e-6b6d-47c5-a745-a1db1cab9669"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.478193 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7e052e6e-6b6d-47c5-a745-a1db1cab9669-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.582650 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mmvgq"] Mar 07 07:05:10 crc kubenswrapper[4738]: I0307 07:05:10.585523 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mmvgq"] Mar 07 07:05:11 crc kubenswrapper[4738]: I0307 07:05:11.919027 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67hw7"] Mar 07 07:05:11 crc kubenswrapper[4738]: I0307 07:05:11.919473 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-67hw7" podUID="11b75931-7617-4db8-b9e0-62a32ccd6948" containerName="registry-server" containerID="cri-o://86bc2c43fd2ad7ad181510c495d17d8ba6838193a71fe35d7573c52eb2b3a81e" gracePeriod=2 Mar 07 07:05:12 crc kubenswrapper[4738]: I0307 07:05:12.265881 4738 generic.go:334] "Generic (PLEG): container finished" podID="11b75931-7617-4db8-b9e0-62a32ccd6948" containerID="86bc2c43fd2ad7ad181510c495d17d8ba6838193a71fe35d7573c52eb2b3a81e" exitCode=0 Mar 07 07:05:12 crc kubenswrapper[4738]: I0307 07:05:12.265974 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-67hw7" 
event={"ID":"11b75931-7617-4db8-b9e0-62a32ccd6948","Type":"ContainerDied","Data":"86bc2c43fd2ad7ad181510c495d17d8ba6838193a71fe35d7573c52eb2b3a81e"} Mar 07 07:05:12 crc kubenswrapper[4738]: I0307 07:05:12.394213 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e052e6e-6b6d-47c5-a745-a1db1cab9669" path="/var/lib/kubelet/pods/7e052e6e-6b6d-47c5-a745-a1db1cab9669/volumes" Mar 07 07:05:12 crc kubenswrapper[4738]: I0307 07:05:12.442613 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67hw7" Mar 07 07:05:12 crc kubenswrapper[4738]: I0307 07:05:12.509573 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11b75931-7617-4db8-b9e0-62a32ccd6948-catalog-content\") pod \"11b75931-7617-4db8-b9e0-62a32ccd6948\" (UID: \"11b75931-7617-4db8-b9e0-62a32ccd6948\") " Mar 07 07:05:12 crc kubenswrapper[4738]: I0307 07:05:12.509649 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11b75931-7617-4db8-b9e0-62a32ccd6948-utilities\") pod \"11b75931-7617-4db8-b9e0-62a32ccd6948\" (UID: \"11b75931-7617-4db8-b9e0-62a32ccd6948\") " Mar 07 07:05:12 crc kubenswrapper[4738]: I0307 07:05:12.509688 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvzg2\" (UniqueName: \"kubernetes.io/projected/11b75931-7617-4db8-b9e0-62a32ccd6948-kube-api-access-lvzg2\") pod \"11b75931-7617-4db8-b9e0-62a32ccd6948\" (UID: \"11b75931-7617-4db8-b9e0-62a32ccd6948\") " Mar 07 07:05:12 crc kubenswrapper[4738]: I0307 07:05:12.510991 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11b75931-7617-4db8-b9e0-62a32ccd6948-utilities" (OuterVolumeSpecName: "utilities") pod "11b75931-7617-4db8-b9e0-62a32ccd6948" (UID: 
"11b75931-7617-4db8-b9e0-62a32ccd6948"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:05:12 crc kubenswrapper[4738]: I0307 07:05:12.515441 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11b75931-7617-4db8-b9e0-62a32ccd6948-kube-api-access-lvzg2" (OuterVolumeSpecName: "kube-api-access-lvzg2") pod "11b75931-7617-4db8-b9e0-62a32ccd6948" (UID: "11b75931-7617-4db8-b9e0-62a32ccd6948"). InnerVolumeSpecName "kube-api-access-lvzg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:05:12 crc kubenswrapper[4738]: I0307 07:05:12.538706 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11b75931-7617-4db8-b9e0-62a32ccd6948-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11b75931-7617-4db8-b9e0-62a32ccd6948" (UID: "11b75931-7617-4db8-b9e0-62a32ccd6948"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:05:12 crc kubenswrapper[4738]: I0307 07:05:12.610380 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11b75931-7617-4db8-b9e0-62a32ccd6948-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:12 crc kubenswrapper[4738]: I0307 07:05:12.610411 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11b75931-7617-4db8-b9e0-62a32ccd6948-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:12 crc kubenswrapper[4738]: I0307 07:05:12.610420 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvzg2\" (UniqueName: \"kubernetes.io/projected/11b75931-7617-4db8-b9e0-62a32ccd6948-kube-api-access-lvzg2\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:13 crc kubenswrapper[4738]: I0307 07:05:13.288437 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-67hw7" event={"ID":"11b75931-7617-4db8-b9e0-62a32ccd6948","Type":"ContainerDied","Data":"29ef7d41857452915f61a273e1a2785ef0f0e6560e8fbfeb71c28e248765fd85"} Mar 07 07:05:13 crc kubenswrapper[4738]: I0307 07:05:13.288528 4738 scope.go:117] "RemoveContainer" containerID="86bc2c43fd2ad7ad181510c495d17d8ba6838193a71fe35d7573c52eb2b3a81e" Mar 07 07:05:13 crc kubenswrapper[4738]: I0307 07:05:13.288560 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-67hw7" Mar 07 07:05:13 crc kubenswrapper[4738]: I0307 07:05:13.319870 4738 scope.go:117] "RemoveContainer" containerID="e20fa54a4e1b3410a9e5c3569544cd786b62cab5a2267cdbd99eab9a32b05438" Mar 07 07:05:13 crc kubenswrapper[4738]: I0307 07:05:13.342921 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-67hw7"] Mar 07 07:05:13 crc kubenswrapper[4738]: I0307 07:05:13.346222 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-67hw7"] Mar 07 07:05:13 crc kubenswrapper[4738]: I0307 07:05:13.366248 4738 scope.go:117] "RemoveContainer" containerID="fe078cb41a33ede83d13f77e23e14f8f237234696ed5a881623a0c96c5277d19" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.033489 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-df74875fc-75rml"] Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.033833 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-df74875fc-75rml" podUID="83c9105f-ba15-4aa4-8310-cf1616dd359a" containerName="controller-manager" containerID="cri-o://a713eab4e3ae69458ebcf81e5a3b8741157d614608ead4947f69e82f8662a06f" gracePeriod=30 Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.124368 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"] Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.124938 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l" podUID="1f1ba920-86ac-4a49-80e7-356fad40b91b" containerName="route-controller-manager" containerID="cri-o://5033f8785c0dea8d9cb20ae65fa2d765ccde48d713d62a92cae05668cccf5425" gracePeriod=30 Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.315344 4738 generic.go:334] "Generic (PLEG): container finished" podID="1f1ba920-86ac-4a49-80e7-356fad40b91b" containerID="5033f8785c0dea8d9cb20ae65fa2d765ccde48d713d62a92cae05668cccf5425" exitCode=0 Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.315429 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l" event={"ID":"1f1ba920-86ac-4a49-80e7-356fad40b91b","Type":"ContainerDied","Data":"5033f8785c0dea8d9cb20ae65fa2d765ccde48d713d62a92cae05668cccf5425"} Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.324231 4738 generic.go:334] "Generic (PLEG): container finished" podID="83c9105f-ba15-4aa4-8310-cf1616dd359a" containerID="a713eab4e3ae69458ebcf81e5a3b8741157d614608ead4947f69e82f8662a06f" exitCode=0 Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.324282 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-df74875fc-75rml" event={"ID":"83c9105f-ba15-4aa4-8310-cf1616dd359a","Type":"ContainerDied","Data":"a713eab4e3ae69458ebcf81e5a3b8741157d614608ead4947f69e82f8662a06f"} Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.403670 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11b75931-7617-4db8-b9e0-62a32ccd6948" path="/var/lib/kubelet/pods/11b75931-7617-4db8-b9e0-62a32ccd6948/volumes" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 
07:05:14.621144 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.671571 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.762392 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-config\") pod \"83c9105f-ba15-4aa4-8310-cf1616dd359a\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.763000 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-proxy-ca-bundles\") pod \"83c9105f-ba15-4aa4-8310-cf1616dd359a\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.763145 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5dzg\" (UniqueName: \"kubernetes.io/projected/83c9105f-ba15-4aa4-8310-cf1616dd359a-kube-api-access-b5dzg\") pod \"83c9105f-ba15-4aa4-8310-cf1616dd359a\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.763226 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-client-ca\") pod \"83c9105f-ba15-4aa4-8310-cf1616dd359a\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.763290 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/83c9105f-ba15-4aa4-8310-cf1616dd359a-serving-cert\") pod \"83c9105f-ba15-4aa4-8310-cf1616dd359a\" (UID: \"83c9105f-ba15-4aa4-8310-cf1616dd359a\") " Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.763515 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-config" (OuterVolumeSpecName: "config") pod "83c9105f-ba15-4aa4-8310-cf1616dd359a" (UID: "83c9105f-ba15-4aa4-8310-cf1616dd359a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.763837 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.763993 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-client-ca" (OuterVolumeSpecName: "client-ca") pod "83c9105f-ba15-4aa4-8310-cf1616dd359a" (UID: "83c9105f-ba15-4aa4-8310-cf1616dd359a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.763996 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "83c9105f-ba15-4aa4-8310-cf1616dd359a" (UID: "83c9105f-ba15-4aa4-8310-cf1616dd359a"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.769890 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83c9105f-ba15-4aa4-8310-cf1616dd359a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "83c9105f-ba15-4aa4-8310-cf1616dd359a" (UID: "83c9105f-ba15-4aa4-8310-cf1616dd359a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.769968 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83c9105f-ba15-4aa4-8310-cf1616dd359a-kube-api-access-b5dzg" (OuterVolumeSpecName: "kube-api-access-b5dzg") pod "83c9105f-ba15-4aa4-8310-cf1616dd359a" (UID: "83c9105f-ba15-4aa4-8310-cf1616dd359a"). InnerVolumeSpecName "kube-api-access-b5dzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.864878 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qhtl\" (UniqueName: \"kubernetes.io/projected/1f1ba920-86ac-4a49-80e7-356fad40b91b-kube-api-access-6qhtl\") pod \"1f1ba920-86ac-4a49-80e7-356fad40b91b\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.864956 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f1ba920-86ac-4a49-80e7-356fad40b91b-config\") pod \"1f1ba920-86ac-4a49-80e7-356fad40b91b\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.865004 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f1ba920-86ac-4a49-80e7-356fad40b91b-client-ca\") pod \"1f1ba920-86ac-4a49-80e7-356fad40b91b\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " 
Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.865071 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f1ba920-86ac-4a49-80e7-356fad40b91b-serving-cert\") pod \"1f1ba920-86ac-4a49-80e7-356fad40b91b\" (UID: \"1f1ba920-86ac-4a49-80e7-356fad40b91b\") " Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.865453 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5dzg\" (UniqueName: \"kubernetes.io/projected/83c9105f-ba15-4aa4-8310-cf1616dd359a-kube-api-access-b5dzg\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.865475 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.865487 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83c9105f-ba15-4aa4-8310-cf1616dd359a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.865500 4738 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/83c9105f-ba15-4aa4-8310-cf1616dd359a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.866459 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f1ba920-86ac-4a49-80e7-356fad40b91b-client-ca" (OuterVolumeSpecName: "client-ca") pod "1f1ba920-86ac-4a49-80e7-356fad40b91b" (UID: "1f1ba920-86ac-4a49-80e7-356fad40b91b"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.866656 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f1ba920-86ac-4a49-80e7-356fad40b91b-config" (OuterVolumeSpecName: "config") pod "1f1ba920-86ac-4a49-80e7-356fad40b91b" (UID: "1f1ba920-86ac-4a49-80e7-356fad40b91b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.868935 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f1ba920-86ac-4a49-80e7-356fad40b91b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1f1ba920-86ac-4a49-80e7-356fad40b91b" (UID: "1f1ba920-86ac-4a49-80e7-356fad40b91b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.869507 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f1ba920-86ac-4a49-80e7-356fad40b91b-kube-api-access-6qhtl" (OuterVolumeSpecName: "kube-api-access-6qhtl") pod "1f1ba920-86ac-4a49-80e7-356fad40b91b" (UID: "1f1ba920-86ac-4a49-80e7-356fad40b91b"). InnerVolumeSpecName "kube-api-access-6qhtl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.966641 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qhtl\" (UniqueName: \"kubernetes.io/projected/1f1ba920-86ac-4a49-80e7-356fad40b91b-kube-api-access-6qhtl\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.966674 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f1ba920-86ac-4a49-80e7-356fad40b91b-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.966685 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f1ba920-86ac-4a49-80e7-356fad40b91b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:14 crc kubenswrapper[4738]: I0307 07:05:14.966695 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f1ba920-86ac-4a49-80e7-356fad40b91b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.341569 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-df74875fc-75rml" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.341540 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-df74875fc-75rml" event={"ID":"83c9105f-ba15-4aa4-8310-cf1616dd359a","Type":"ContainerDied","Data":"1a1f5195758e9b34de82fe47b0aefff5a89c972037b39dcad0342e31d8285d37"} Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.341817 4738 scope.go:117] "RemoveContainer" containerID="a713eab4e3ae69458ebcf81e5a3b8741157d614608ead4947f69e82f8662a06f" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.344619 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l" event={"ID":"1f1ba920-86ac-4a49-80e7-356fad40b91b","Type":"ContainerDied","Data":"090b57359839aacf04bedab27a880b2ae8b35050fe9b235b566ae8c536b9eac7"} Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.344721 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.367823 4738 scope.go:117] "RemoveContainer" containerID="5033f8785c0dea8d9cb20ae65fa2d765ccde48d713d62a92cae05668cccf5425" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.388344 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"] Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.392933 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-748655f4c9-mnc6l"] Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.402297 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-df74875fc-75rml"] Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.406282 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-df74875fc-75rml"] Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685230 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6cd6c649c-mgwvg"] Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685592 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef082c9b-8cc2-4c38-8957-28912470b473" containerName="registry-server" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685610 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef082c9b-8cc2-4c38-8957-28912470b473" containerName="registry-server" Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685624 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e052e6e-6b6d-47c5-a745-a1db1cab9669" containerName="registry-server" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685632 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7e052e6e-6b6d-47c5-a745-a1db1cab9669" containerName="registry-server" Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685642 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b75931-7617-4db8-b9e0-62a32ccd6948" containerName="registry-server" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685654 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b75931-7617-4db8-b9e0-62a32ccd6948" containerName="registry-server" Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685670 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09f978c5-7fd4-4852-95c4-915304c1bf18" containerName="oc" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685679 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="09f978c5-7fd4-4852-95c4-915304c1bf18" containerName="oc" Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685689 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83c9105f-ba15-4aa4-8310-cf1616dd359a" containerName="controller-manager" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685697 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="83c9105f-ba15-4aa4-8310-cf1616dd359a" containerName="controller-manager" Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685710 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef082c9b-8cc2-4c38-8957-28912470b473" containerName="extract-content" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685718 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef082c9b-8cc2-4c38-8957-28912470b473" containerName="extract-content" Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685725 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e052e6e-6b6d-47c5-a745-a1db1cab9669" containerName="extract-content" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685735 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7e052e6e-6b6d-47c5-a745-a1db1cab9669" containerName="extract-content" Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685746 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef082c9b-8cc2-4c38-8957-28912470b473" containerName="extract-utilities" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685758 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef082c9b-8cc2-4c38-8957-28912470b473" containerName="extract-utilities" Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685771 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f1ba920-86ac-4a49-80e7-356fad40b91b" containerName="route-controller-manager" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685779 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f1ba920-86ac-4a49-80e7-356fad40b91b" containerName="route-controller-manager" Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685789 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" containerName="registry-server" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685796 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" containerName="registry-server" Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685808 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" containerName="extract-content" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685815 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" containerName="extract-content" Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685825 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a41584d7-5e09-4e4b-9b1d-6aa4b7d13039" containerName="oc" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685835 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a41584d7-5e09-4e4b-9b1d-6aa4b7d13039" containerName="oc" Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685848 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b75931-7617-4db8-b9e0-62a32ccd6948" containerName="extract-content" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685858 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b75931-7617-4db8-b9e0-62a32ccd6948" containerName="extract-content" Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685870 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" containerName="extract-utilities" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685881 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" containerName="extract-utilities" Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685893 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11b75931-7617-4db8-b9e0-62a32ccd6948" containerName="extract-utilities" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685901 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="11b75931-7617-4db8-b9e0-62a32ccd6948" containerName="extract-utilities" Mar 07 07:05:15 crc kubenswrapper[4738]: E0307 07:05:15.685913 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7e052e6e-6b6d-47c5-a745-a1db1cab9669" containerName="extract-utilities" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.685922 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e052e6e-6b6d-47c5-a745-a1db1cab9669" containerName="extract-utilities" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.686049 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="11b75931-7617-4db8-b9e0-62a32ccd6948" containerName="registry-server" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.686064 4738 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a41584d7-5e09-4e4b-9b1d-6aa4b7d13039" containerName="oc" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.686073 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="09f978c5-7fd4-4852-95c4-915304c1bf18" containerName="oc" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.686084 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef082c9b-8cc2-4c38-8957-28912470b473" containerName="registry-server" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.686095 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d4ca81-b9b6-4d90-b7c3-b08e4fa3c7f0" containerName="registry-server" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.686105 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="83c9105f-ba15-4aa4-8310-cf1616dd359a" containerName="controller-manager" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.686120 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f1ba920-86ac-4a49-80e7-356fad40b91b" containerName="route-controller-manager" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.686133 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e052e6e-6b6d-47c5-a745-a1db1cab9669" containerName="registry-server" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.686794 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.689015 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8"] Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.689861 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.689936 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.690066 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.690136 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.690147 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.698909 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.701802 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.703947 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.703954 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-controller-manager/controller-manager-6cd6c649c-mgwvg"] Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.703977 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.703984 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.704203 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.704274 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.709066 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.709962 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.710480 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8"] Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.779399 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7bb0bc1-7992-4779-9ed5-23109e61365a-serving-cert\") pod \"route-controller-manager-557d8dc587-lsnb8\" (UID: \"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.779451 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ef65b2a-78bf-4efc-9adf-84472dbddb49-serving-cert\") pod \"controller-manager-6cd6c649c-mgwvg\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.779471 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7bb0bc1-7992-4779-9ed5-23109e61365a-client-ca\") pod \"route-controller-manager-557d8dc587-lsnb8\" (UID: \"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.779513 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-proxy-ca-bundles\") pod \"controller-manager-6cd6c649c-mgwvg\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.779666 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-client-ca\") pod \"controller-manager-6cd6c649c-mgwvg\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.779705 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bb0bc1-7992-4779-9ed5-23109e61365a-config\") pod \"route-controller-manager-557d8dc587-lsnb8\" (UID: 
\"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.779730 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmzcm\" (UniqueName: \"kubernetes.io/projected/5ef65b2a-78bf-4efc-9adf-84472dbddb49-kube-api-access-lmzcm\") pod \"controller-manager-6cd6c649c-mgwvg\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.779750 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-config\") pod \"controller-manager-6cd6c649c-mgwvg\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.779938 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdbt7\" (UniqueName: \"kubernetes.io/projected/d7bb0bc1-7992-4779-9ed5-23109e61365a-kube-api-access-bdbt7\") pod \"route-controller-manager-557d8dc587-lsnb8\" (UID: \"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.880920 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7bb0bc1-7992-4779-9ed5-23109e61365a-serving-cert\") pod \"route-controller-manager-557d8dc587-lsnb8\" (UID: \"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.881007 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ef65b2a-78bf-4efc-9adf-84472dbddb49-serving-cert\") pod \"controller-manager-6cd6c649c-mgwvg\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.881035 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7bb0bc1-7992-4779-9ed5-23109e61365a-client-ca\") pod \"route-controller-manager-557d8dc587-lsnb8\" (UID: \"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.881089 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-proxy-ca-bundles\") pod \"controller-manager-6cd6c649c-mgwvg\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.881120 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-client-ca\") pod \"controller-manager-6cd6c649c-mgwvg\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.881147 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmzcm\" (UniqueName: \"kubernetes.io/projected/5ef65b2a-78bf-4efc-9adf-84472dbddb49-kube-api-access-lmzcm\") pod \"controller-manager-6cd6c649c-mgwvg\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " 
pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.881193 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bb0bc1-7992-4779-9ed5-23109e61365a-config\") pod \"route-controller-manager-557d8dc587-lsnb8\" (UID: \"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.881214 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-config\") pod \"controller-manager-6cd6c649c-mgwvg\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.881251 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdbt7\" (UniqueName: \"kubernetes.io/projected/d7bb0bc1-7992-4779-9ed5-23109e61365a-kube-api-access-bdbt7\") pod \"route-controller-manager-557d8dc587-lsnb8\" (UID: \"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.883813 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7bb0bc1-7992-4779-9ed5-23109e61365a-client-ca\") pod \"route-controller-manager-557d8dc587-lsnb8\" (UID: \"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.883888 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-client-ca\") pod \"controller-manager-6cd6c649c-mgwvg\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.883933 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-proxy-ca-bundles\") pod \"controller-manager-6cd6c649c-mgwvg\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.883961 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bb0bc1-7992-4779-9ed5-23109e61365a-config\") pod \"route-controller-manager-557d8dc587-lsnb8\" (UID: \"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.884205 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-config\") pod \"controller-manager-6cd6c649c-mgwvg\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.888100 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7bb0bc1-7992-4779-9ed5-23109e61365a-serving-cert\") pod \"route-controller-manager-557d8dc587-lsnb8\" (UID: \"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.895938 4738 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ef65b2a-78bf-4efc-9adf-84472dbddb49-serving-cert\") pod \"controller-manager-6cd6c649c-mgwvg\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.907479 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmzcm\" (UniqueName: \"kubernetes.io/projected/5ef65b2a-78bf-4efc-9adf-84472dbddb49-kube-api-access-lmzcm\") pod \"controller-manager-6cd6c649c-mgwvg\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:15 crc kubenswrapper[4738]: I0307 07:05:15.909357 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdbt7\" (UniqueName: \"kubernetes.io/projected/d7bb0bc1-7992-4779-9ed5-23109e61365a-kube-api-access-bdbt7\") pod \"route-controller-manager-557d8dc587-lsnb8\" (UID: \"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:16 crc kubenswrapper[4738]: I0307 07:05:16.002881 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:16 crc kubenswrapper[4738]: I0307 07:05:16.009545 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:16 crc kubenswrapper[4738]: I0307 07:05:16.257242 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8"] Mar 07 07:05:16 crc kubenswrapper[4738]: I0307 07:05:16.285629 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cd6c649c-mgwvg"] Mar 07 07:05:16 crc kubenswrapper[4738]: W0307 07:05:16.314674 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ef65b2a_78bf_4efc_9adf_84472dbddb49.slice/crio-1064a169f3bd87a40e921742b3221b4d3bc62a3c483abdd026de8a1930089bf0 WatchSource:0}: Error finding container 1064a169f3bd87a40e921742b3221b4d3bc62a3c483abdd026de8a1930089bf0: Status 404 returned error can't find the container with id 1064a169f3bd87a40e921742b3221b4d3bc62a3c483abdd026de8a1930089bf0 Mar 07 07:05:16 crc kubenswrapper[4738]: I0307 07:05:16.355205 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" event={"ID":"5ef65b2a-78bf-4efc-9adf-84472dbddb49","Type":"ContainerStarted","Data":"1064a169f3bd87a40e921742b3221b4d3bc62a3c483abdd026de8a1930089bf0"} Mar 07 07:05:16 crc kubenswrapper[4738]: I0307 07:05:16.356390 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" event={"ID":"d7bb0bc1-7992-4779-9ed5-23109e61365a","Type":"ContainerStarted","Data":"a578914dc7ea677bd84d00b6a3a6e5b514da27a4dc176831eccd0e5db48c691d"} Mar 07 07:05:16 crc kubenswrapper[4738]: I0307 07:05:16.399139 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f1ba920-86ac-4a49-80e7-356fad40b91b" path="/var/lib/kubelet/pods/1f1ba920-86ac-4a49-80e7-356fad40b91b/volumes" Mar 07 07:05:16 
crc kubenswrapper[4738]: I0307 07:05:16.399762 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83c9105f-ba15-4aa4-8310-cf1616dd359a" path="/var/lib/kubelet/pods/83c9105f-ba15-4aa4-8310-cf1616dd359a/volumes" Mar 07 07:05:17 crc kubenswrapper[4738]: I0307 07:05:17.368734 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" event={"ID":"5ef65b2a-78bf-4efc-9adf-84472dbddb49","Type":"ContainerStarted","Data":"8288323cb76757fcccf06415746798e4960c48b59bdd35eee2a7b48a56892643"} Mar 07 07:05:17 crc kubenswrapper[4738]: I0307 07:05:17.369279 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:17 crc kubenswrapper[4738]: I0307 07:05:17.371378 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" event={"ID":"d7bb0bc1-7992-4779-9ed5-23109e61365a","Type":"ContainerStarted","Data":"e294130d32381a62f724b2e4b22446c39b45e82fb88b531bb3e16922c689a70d"} Mar 07 07:05:17 crc kubenswrapper[4738]: I0307 07:05:17.371966 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:17 crc kubenswrapper[4738]: I0307 07:05:17.376491 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:17 crc kubenswrapper[4738]: I0307 07:05:17.380082 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:17 crc kubenswrapper[4738]: I0307 07:05:17.442348 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" 
podStartSLOduration=3.442330647 podStartE2EDuration="3.442330647s" podCreationTimestamp="2026-03-07 07:05:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:05:17.438322728 +0000 UTC m=+335.903310039" watchObservedRunningTime="2026-03-07 07:05:17.442330647 +0000 UTC m=+335.907317968" Mar 07 07:05:17 crc kubenswrapper[4738]: I0307 07:05:17.475557 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" podStartSLOduration=3.4755348010000002 podStartE2EDuration="3.475534801s" podCreationTimestamp="2026-03-07 07:05:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:05:17.474555174 +0000 UTC m=+335.939542495" watchObservedRunningTime="2026-03-07 07:05:17.475534801 +0000 UTC m=+335.940522122" Mar 07 07:05:17 crc kubenswrapper[4738]: I0307 07:05:17.668801 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:05:17 crc kubenswrapper[4738]: I0307 07:05:17.713363 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.066078 4738 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.067126 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.070781 4738 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.071372 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114" gracePeriod=15 Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.071667 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c" gracePeriod=15 Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.071796 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9" gracePeriod=15 Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.071867 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb" gracePeriod=15 Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.072005 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf" gracePeriod=15 Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.072857 4738 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 07 07:05:19 crc kubenswrapper[4738]: E0307 07:05:19.073292 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073323 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 07:05:19 crc kubenswrapper[4738]: E0307 07:05:19.073344 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073356 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 07 07:05:19 crc kubenswrapper[4738]: E0307 07:05:19.073381 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073397 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 07 07:05:19 crc kubenswrapper[4738]: E0307 07:05:19.073423 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073444 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Mar 07 07:05:19 crc kubenswrapper[4738]: E0307 07:05:19.073466 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073479 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 07:05:19 crc kubenswrapper[4738]: E0307 07:05:19.073499 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073511 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 07:05:19 crc kubenswrapper[4738]: E0307 07:05:19.073524 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073538 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 07:05:19 crc kubenswrapper[4738]: E0307 07:05:19.073553 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073565 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 07 07:05:19 crc kubenswrapper[4738]: E0307 07:05:19.073587 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073599 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073799 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073816 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073831 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073848 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073867 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073885 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.073898 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 07:05:19 crc kubenswrapper[4738]: E0307 07:05:19.074079 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.074096 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 07:05:19 
crc kubenswrapper[4738]: I0307 07:05:19.075671 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.075700 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.119794 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.144101 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.144318 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.144394 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.144562 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.144624 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.144711 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.144764 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.144870 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.245890 4738 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.246361 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.246082 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.246393 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.246419 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.246449 4738 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.246468 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.246525 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.246564 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.246527 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.246574 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.246622 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.246761 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.246795 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.246628 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.246623 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.386841 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.388532 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.389303 4738 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c" exitCode=0 Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.389338 4738 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb" exitCode=0 Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.389353 4738 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9" exitCode=0 Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.389365 4738 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf" exitCode=2 Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.389440 4738 scope.go:117] "RemoveContainer" containerID="1438a7b46d355d2ecc50c3bf09898395e33ce17f32c229f13c20a8a93facb233" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.392540 4738 generic.go:334] "Generic (PLEG): container finished" podID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" 
containerID="f65517aa8bce3e2b0b5daf26c3cabec7932e713546fd6b2c4c1b0833f5188e14" exitCode=0 Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.393510 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"03158b9d-6cc1-4f5d-af1f-d21de41d536d","Type":"ContainerDied","Data":"f65517aa8bce3e2b0b5daf26c3cabec7932e713546fd6b2c4c1b0833f5188e14"} Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.394319 4738 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.394605 4738 status_manager.go:851] "Failed to get status for pod" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.395017 4738 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:19 crc kubenswrapper[4738]: I0307 07:05:19.414382 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:05:19 crc kubenswrapper[4738]: W0307 07:05:19.446517 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-e829142ef99493443a2de68858b5e318df5bc64c3fbb8f484ee41274c63f7f84 WatchSource:0}: Error finding container e829142ef99493443a2de68858b5e318df5bc64c3fbb8f484ee41274c63f7f84: Status 404 returned error can't find the container with id e829142ef99493443a2de68858b5e318df5bc64c3fbb8f484ee41274c63f7f84 Mar 07 07:05:19 crc kubenswrapper[4738]: E0307 07:05:19.469280 4738 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.51:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189a7d4089e09a95 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 07:05:19.468288661 +0000 UTC m=+337.933275992,LastTimestamp:2026-03-07 07:05:19.468288661 +0000 UTC m=+337.933275992,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:05:20 crc kubenswrapper[4738]: E0307 07:05:20.312102 4738 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.51:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189a7d4089e09a95 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 07:05:19.468288661 +0000 UTC m=+337.933275992,LastTimestamp:2026-03-07 07:05:19.468288661 +0000 UTC m=+337.933275992,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.412734 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"3eec800aab5f2d2bb340840c98b491de29da4e5b97b3ebf9f27411d4def6423f"} Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.412803 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"e829142ef99493443a2de68858b5e318df5bc64c3fbb8f484ee41274c63f7f84"} Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.414321 4738 status_manager.go:851] "Failed to get status for pod" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.415460 4738 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.420322 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.700841 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" containerName="oauth-openshift" containerID="cri-o://3561ba94a3ae92c42636eef0b14f64b33a1ee4c46d2535ab76cd541e58c18524" gracePeriod=15 Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.806458 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.807147 4738 status_manager.go:851] "Failed to get status for pod" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.807486 4738 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.873782 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03158b9d-6cc1-4f5d-af1f-d21de41d536d-kubelet-dir\") pod \"03158b9d-6cc1-4f5d-af1f-d21de41d536d\" (UID: \"03158b9d-6cc1-4f5d-af1f-d21de41d536d\") " Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.873845 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/03158b9d-6cc1-4f5d-af1f-d21de41d536d-var-lock\") pod \"03158b9d-6cc1-4f5d-af1f-d21de41d536d\" (UID: \"03158b9d-6cc1-4f5d-af1f-d21de41d536d\") " Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.873970 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03158b9d-6cc1-4f5d-af1f-d21de41d536d-kube-api-access\") pod \"03158b9d-6cc1-4f5d-af1f-d21de41d536d\" (UID: \"03158b9d-6cc1-4f5d-af1f-d21de41d536d\") " Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.874297 4738 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03158b9d-6cc1-4f5d-af1f-d21de41d536d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "03158b9d-6cc1-4f5d-af1f-d21de41d536d" (UID: "03158b9d-6cc1-4f5d-af1f-d21de41d536d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.874320 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03158b9d-6cc1-4f5d-af1f-d21de41d536d-var-lock" (OuterVolumeSpecName: "var-lock") pod "03158b9d-6cc1-4f5d-af1f-d21de41d536d" (UID: "03158b9d-6cc1-4f5d-af1f-d21de41d536d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.874517 4738 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/03158b9d-6cc1-4f5d-af1f-d21de41d536d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.874543 4738 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/03158b9d-6cc1-4f5d-af1f-d21de41d536d-var-lock\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.885634 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03158b9d-6cc1-4f5d-af1f-d21de41d536d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "03158b9d-6cc1-4f5d-af1f-d21de41d536d" (UID: "03158b9d-6cc1-4f5d-af1f-d21de41d536d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:05:20 crc kubenswrapper[4738]: I0307 07:05:20.975481 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/03158b9d-6cc1-4f5d-af1f-d21de41d536d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.225119 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.228059 4738 status_manager.go:851] "Failed to get status for pod" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-7vtnd\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.228579 4738 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.228930 4738 status_manager.go:851] "Failed to get status for pod" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.288542 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-error\") pod \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.288597 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtqlw\" (UniqueName: \"kubernetes.io/projected/cd2b6e69-054d-461b-8a1d-ca38261a83d3-kube-api-access-gtqlw\") pod \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.288616 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-session\") pod \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.288636 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-serving-cert\") pod \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.288665 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-trusted-ca-bundle\") pod \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.288684 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-router-certs\") pod \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.288709 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-provider-selection\") pod \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.288732 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-audit-policies\") pod \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.288820 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-login\") pod \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.288866 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-ocp-branding-template\") pod \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.288904 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-service-ca\") pod \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.288932 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-idp-0-file-data\") pod \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.288956 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-cliconfig\") pod \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.288978 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd2b6e69-054d-461b-8a1d-ca38261a83d3-audit-dir\") pod \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\" (UID: \"cd2b6e69-054d-461b-8a1d-ca38261a83d3\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.289306 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cd2b6e69-054d-461b-8a1d-ca38261a83d3-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cd2b6e69-054d-461b-8a1d-ca38261a83d3" (UID: "cd2b6e69-054d-461b-8a1d-ca38261a83d3"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.290550 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "cd2b6e69-054d-461b-8a1d-ca38261a83d3" (UID: "cd2b6e69-054d-461b-8a1d-ca38261a83d3"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.292684 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "cd2b6e69-054d-461b-8a1d-ca38261a83d3" (UID: "cd2b6e69-054d-461b-8a1d-ca38261a83d3"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.292779 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "cd2b6e69-054d-461b-8a1d-ca38261a83d3" (UID: "cd2b6e69-054d-461b-8a1d-ca38261a83d3"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.293396 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "cd2b6e69-054d-461b-8a1d-ca38261a83d3" (UID: "cd2b6e69-054d-461b-8a1d-ca38261a83d3"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.296946 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "cd2b6e69-054d-461b-8a1d-ca38261a83d3" (UID: "cd2b6e69-054d-461b-8a1d-ca38261a83d3"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.297858 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "cd2b6e69-054d-461b-8a1d-ca38261a83d3" (UID: "cd2b6e69-054d-461b-8a1d-ca38261a83d3"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.297864 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "cd2b6e69-054d-461b-8a1d-ca38261a83d3" (UID: "cd2b6e69-054d-461b-8a1d-ca38261a83d3"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.299817 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd2b6e69-054d-461b-8a1d-ca38261a83d3-kube-api-access-gtqlw" (OuterVolumeSpecName: "kube-api-access-gtqlw") pod "cd2b6e69-054d-461b-8a1d-ca38261a83d3" (UID: "cd2b6e69-054d-461b-8a1d-ca38261a83d3"). InnerVolumeSpecName "kube-api-access-gtqlw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.300686 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "cd2b6e69-054d-461b-8a1d-ca38261a83d3" (UID: "cd2b6e69-054d-461b-8a1d-ca38261a83d3"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.301168 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "cd2b6e69-054d-461b-8a1d-ca38261a83d3" (UID: "cd2b6e69-054d-461b-8a1d-ca38261a83d3"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.302767 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "cd2b6e69-054d-461b-8a1d-ca38261a83d3" (UID: "cd2b6e69-054d-461b-8a1d-ca38261a83d3"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.305535 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "cd2b6e69-054d-461b-8a1d-ca38261a83d3" (UID: "cd2b6e69-054d-461b-8a1d-ca38261a83d3"). 
InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.309057 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "cd2b6e69-054d-461b-8a1d-ca38261a83d3" (UID: "cd2b6e69-054d-461b-8a1d-ca38261a83d3"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.390954 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.391917 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtqlw\" (UniqueName: \"kubernetes.io/projected/cd2b6e69-054d-461b-8a1d-ca38261a83d3-kube-api-access-gtqlw\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.391990 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.392091 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.392178 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.392268 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.392338 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.392412 4738 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.392473 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.392551 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.392629 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 
07:05:21.392692 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.392752 4738 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cd2b6e69-054d-461b-8a1d-ca38261a83d3-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.392821 4738 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cd2b6e69-054d-461b-8a1d-ca38261a83d3-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.429496 4738 generic.go:334] "Generic (PLEG): container finished" podID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" containerID="3561ba94a3ae92c42636eef0b14f64b33a1ee4c46d2535ab76cd541e58c18524" exitCode=0 Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.429602 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" event={"ID":"cd2b6e69-054d-461b-8a1d-ca38261a83d3","Type":"ContainerDied","Data":"3561ba94a3ae92c42636eef0b14f64b33a1ee4c46d2535ab76cd541e58c18524"} Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.429666 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" event={"ID":"cd2b6e69-054d-461b-8a1d-ca38261a83d3","Type":"ContainerDied","Data":"c207b0c05bee3628dda5debb82d9227fe849e435dcdb820d142fdfc3d847f9b6"} Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.429692 4738 scope.go:117] "RemoveContainer" containerID="3561ba94a3ae92c42636eef0b14f64b33a1ee4c46d2535ab76cd541e58c18524" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.430010 4738 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.431588 4738 status_manager.go:851] "Failed to get status for pod" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.432449 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.432439 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"03158b9d-6cc1-4f5d-af1f-d21de41d536d","Type":"ContainerDied","Data":"b77325415d298254b7fe5da429dc386f7a71cad961c9382804f3dbe3963e2b33"} Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.432516 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b77325415d298254b7fe5da429dc386f7a71cad961c9382804f3dbe3963e2b33" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.434604 4738 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.435205 4738 status_manager.go:851] "Failed to get status for pod" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-7vtnd\": 
dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: E0307 07:05:21.485320 4738 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.51:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" volumeName="registry-storage" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.502271 4738 scope.go:117] "RemoveContainer" containerID="3561ba94a3ae92c42636eef0b14f64b33a1ee4c46d2535ab76cd541e58c18524" Mar 07 07:05:21 crc kubenswrapper[4738]: E0307 07:05:21.503705 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3561ba94a3ae92c42636eef0b14f64b33a1ee4c46d2535ab76cd541e58c18524\": container with ID starting with 3561ba94a3ae92c42636eef0b14f64b33a1ee4c46d2535ab76cd541e58c18524 not found: ID does not exist" containerID="3561ba94a3ae92c42636eef0b14f64b33a1ee4c46d2535ab76cd541e58c18524" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.503906 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3561ba94a3ae92c42636eef0b14f64b33a1ee4c46d2535ab76cd541e58c18524"} err="failed to get container status \"3561ba94a3ae92c42636eef0b14f64b33a1ee4c46d2535ab76cd541e58c18524\": rpc error: code = NotFound desc = could not find container \"3561ba94a3ae92c42636eef0b14f64b33a1ee4c46d2535ab76cd541e58c18524\": container with ID starting with 3561ba94a3ae92c42636eef0b14f64b33a1ee4c46d2535ab76cd541e58c18524 not found: ID does not exist" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.541073 4738 status_manager.go:851] "Failed to get status for pod" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" 
pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-7vtnd\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.541927 4738 status_manager.go:851] "Failed to get status for pod" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.542196 4738 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.543098 4738 status_manager.go:851] "Failed to get status for pod" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-7vtnd\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.543469 4738 status_manager.go:851] "Failed to get status for pod" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.543956 4738 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.551102 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.552320 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.553648 4738 status_manager.go:851] "Failed to get status for pod" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.554020 4738 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.554416 4738 status_manager.go:851] "Failed to get status for pod" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-7vtnd\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 
07:05:21.555020 4738 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.595497 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.595649 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.595681 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.596035 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.596064 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.596079 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.697520 4738 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.697561 4738 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:21 crc kubenswrapper[4738]: I0307 07:05:21.697571 4738 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.388828 4738 status_manager.go:851] "Failed to get status for pod" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-7vtnd\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.389201 4738 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.389648 4738 status_manager.go:851] "Failed to get status for pod" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.390144 4738 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.394650 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.461623 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.463189 4738 generic.go:334] "Generic (PLEG): container finished" 
podID="f4b27818a5e8e43d0dc095d08835c792" containerID="27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114" exitCode=0 Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.463313 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.463346 4738 scope.go:117] "RemoveContainer" containerID="09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.464550 4738 status_manager.go:851] "Failed to get status for pod" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-7vtnd\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.465059 4738 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.465543 4738 status_manager.go:851] "Failed to get status for pod" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.466002 4738 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.469369 4738 status_manager.go:851] "Failed to get status for pod" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-7vtnd\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.469994 4738 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.471377 4738 status_manager.go:851] "Failed to get status for pod" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.471846 4738 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.479977 4738 scope.go:117] "RemoveContainer" containerID="f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb" Mar 07 07:05:22 crc kubenswrapper[4738]: 
I0307 07:05:22.503090 4738 scope.go:117] "RemoveContainer" containerID="4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.522469 4738 scope.go:117] "RemoveContainer" containerID="200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.538332 4738 scope.go:117] "RemoveContainer" containerID="27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.557295 4738 scope.go:117] "RemoveContainer" containerID="0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.579979 4738 scope.go:117] "RemoveContainer" containerID="09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c" Mar 07 07:05:22 crc kubenswrapper[4738]: E0307 07:05:22.580969 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\": container with ID starting with 09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c not found: ID does not exist" containerID="09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.581043 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c"} err="failed to get container status \"09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\": rpc error: code = NotFound desc = could not find container \"09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c\": container with ID starting with 09c600b9d3bb44d86b62e8b9788c73dbacf167a63fed59f63f7cf0a47f03aa9c not found: ID does not exist" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.581096 4738 scope.go:117] 
"RemoveContainer" containerID="f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb" Mar 07 07:05:22 crc kubenswrapper[4738]: E0307 07:05:22.581500 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\": container with ID starting with f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb not found: ID does not exist" containerID="f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.581555 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb"} err="failed to get container status \"f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\": rpc error: code = NotFound desc = could not find container \"f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb\": container with ID starting with f747f87668ea7293f4f10af29fde8c09764488fd61dd7f7bb02597ad879642cb not found: ID does not exist" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.581593 4738 scope.go:117] "RemoveContainer" containerID="4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9" Mar 07 07:05:22 crc kubenswrapper[4738]: E0307 07:05:22.582459 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\": container with ID starting with 4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9 not found: ID does not exist" containerID="4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.582495 4738 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9"} err="failed to get container status \"4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\": rpc error: code = NotFound desc = could not find container \"4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9\": container with ID starting with 4465a288e842229e7d2d9ba2edb1f39ebc2a904353b5a466f0bfbc966fa1b6a9 not found: ID does not exist" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.582516 4738 scope.go:117] "RemoveContainer" containerID="200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf" Mar 07 07:05:22 crc kubenswrapper[4738]: E0307 07:05:22.582835 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\": container with ID starting with 200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf not found: ID does not exist" containerID="200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.582868 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf"} err="failed to get container status \"200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\": rpc error: code = NotFound desc = could not find container \"200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf\": container with ID starting with 200ed62775775bc0222e503c8b2478f7360b0c1103d6c0b7cf4afc4690640adf not found: ID does not exist" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.582889 4738 scope.go:117] "RemoveContainer" containerID="27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114" Mar 07 07:05:22 crc kubenswrapper[4738]: E0307 07:05:22.583533 4738 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\": container with ID starting with 27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114 not found: ID does not exist" containerID="27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.583601 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114"} err="failed to get container status \"27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\": rpc error: code = NotFound desc = could not find container \"27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114\": container with ID starting with 27a4d8cff90bb0149be16fbe053e2d6ae846f04f0a379d17a29f9f5039ebb114 not found: ID does not exist" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.583631 4738 scope.go:117] "RemoveContainer" containerID="0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699" Mar 07 07:05:22 crc kubenswrapper[4738]: E0307 07:05:22.583999 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\": container with ID starting with 0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699 not found: ID does not exist" containerID="0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699" Mar 07 07:05:22 crc kubenswrapper[4738]: I0307 07:05:22.584034 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699"} err="failed to get container status \"0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\": rpc error: code = NotFound desc = could not find container 
\"0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699\": container with ID starting with 0a98592a14b94c66a5179db88fe8a3b3abbfc48e1f974a4a9f11d06748bda699 not found: ID does not exist" Mar 07 07:05:23 crc kubenswrapper[4738]: E0307 07:05:23.264459 4738 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:23 crc kubenswrapper[4738]: E0307 07:05:23.265323 4738 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:23 crc kubenswrapper[4738]: E0307 07:05:23.266114 4738 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:23 crc kubenswrapper[4738]: E0307 07:05:23.266620 4738 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:23 crc kubenswrapper[4738]: E0307 07:05:23.267086 4738 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:23 crc kubenswrapper[4738]: I0307 07:05:23.267126 4738 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 07 07:05:23 crc kubenswrapper[4738]: E0307 07:05:23.267510 4738 controller.go:145] "Failed to ensure lease exists, will retry" 
err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="200ms" Mar 07 07:05:23 crc kubenswrapper[4738]: E0307 07:05:23.468831 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="400ms" Mar 07 07:05:23 crc kubenswrapper[4738]: E0307 07:05:23.870439 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="800ms" Mar 07 07:05:24 crc kubenswrapper[4738]: E0307 07:05:24.672203 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="1.6s" Mar 07 07:05:26 crc kubenswrapper[4738]: E0307 07:05:26.273303 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: connect: connection refused" interval="3.2s" Mar 07 07:05:27 crc kubenswrapper[4738]: I0307 07:05:27.678485 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:05:27 crc kubenswrapper[4738]: I0307 
07:05:27.678559 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:05:27 crc kubenswrapper[4738]: W0307 07:05:27.679231 4738 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27471": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 07:05:27 crc kubenswrapper[4738]: W0307 07:05:27.679235 4738 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27304": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 07:05:27 crc kubenswrapper[4738]: E0307 07:05:27.679316 4738 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27304\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 07:05:27 crc kubenswrapper[4738]: E0307 07:05:27.679314 4738 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27471\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 07:05:27 crc kubenswrapper[4738]: I0307 07:05:27.779619 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:05:27 crc kubenswrapper[4738]: I0307 07:05:27.779751 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:05:27 crc kubenswrapper[4738]: W0307 07:05:27.780778 4738 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27304": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 07:05:27 crc kubenswrapper[4738]: E0307 07:05:27.780864 4738 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27304\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" 
Mar 07 07:05:28 crc kubenswrapper[4738]: E0307 07:05:28.678717 4738 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Mar 07 07:05:28 crc kubenswrapper[4738]: E0307 07:05:28.678819 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:07:30.678798023 +0000 UTC m=+469.143785334 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Mar 07 07:05:28 crc kubenswrapper[4738]: E0307 07:05:28.679399 4738 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 07 07:05:28 crc kubenswrapper[4738]: E0307 07:05:28.679552 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-07 07:07:30.679524062 +0000 UTC m=+469.144511413 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 07 07:05:28 crc kubenswrapper[4738]: E0307 07:05:28.780712 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 07 07:05:28 crc kubenswrapper[4738]: E0307 07:05:28.780800 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 07 07:05:28 crc kubenswrapper[4738]: W0307 07:05:28.781326 4738 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27304": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 07:05:28 crc kubenswrapper[4738]: E0307 07:05:28.781464 4738 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27304\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 07:05:29 crc kubenswrapper[4738]: E0307 07:05:29.474945 4738 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.51:6443: 
connect: connection refused" interval="6.4s" Mar 07 07:05:29 crc kubenswrapper[4738]: E0307 07:05:29.781419 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 07 07:05:29 crc kubenswrapper[4738]: E0307 07:05:29.781452 4738 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 07 07:05:29 crc kubenswrapper[4738]: E0307 07:05:29.781502 4738 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Mar 07 07:05:29 crc kubenswrapper[4738]: E0307 07:05:29.781597 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-07 07:07:31.781568882 +0000 UTC m=+470.246556243 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Mar 07 07:05:29 crc kubenswrapper[4738]: E0307 07:05:29.781468 4738 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Mar 07 07:05:29 crc kubenswrapper[4738]: E0307 07:05:29.781676 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-07 07:07:31.781663474 +0000 UTC m=+470.246650825 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Mar 07 07:05:29 crc kubenswrapper[4738]: W0307 07:05:29.836536 4738 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27304": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 07:05:29 crc kubenswrapper[4738]: E0307 07:05:29.836668 4738 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27304\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 07:05:30 crc kubenswrapper[4738]: E0307 07:05:30.315501 4738 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.51:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189a7d4089e09a95 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-07 07:05:19.468288661 +0000 UTC m=+337.933275992,LastTimestamp:2026-03-07 07:05:19.468288661 +0000 UTC m=+337.933275992,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 07 07:05:30 crc kubenswrapper[4738]: I0307 07:05:30.385640 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:30 crc kubenswrapper[4738]: I0307 07:05:30.386764 4738 status_manager.go:851] "Failed to get status for pod" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-7vtnd\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:30 crc kubenswrapper[4738]: I0307 07:05:30.387092 4738 status_manager.go:851] "Failed to get status for pod" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:30 crc kubenswrapper[4738]: I0307 07:05:30.387299 4738 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:30 crc kubenswrapper[4738]: I0307 07:05:30.403119 4738 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3b9b5260-782a-4f85-8709-5ba6857b1340" Mar 07 07:05:30 crc kubenswrapper[4738]: I0307 07:05:30.403147 4738 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3b9b5260-782a-4f85-8709-5ba6857b1340" Mar 07 07:05:30 crc kubenswrapper[4738]: E0307 07:05:30.404053 4738 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:30 crc kubenswrapper[4738]: I0307 07:05:30.405053 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:30 crc kubenswrapper[4738]: W0307 07:05:30.410097 4738 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27304": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 07:05:30 crc kubenswrapper[4738]: E0307 07:05:30.410194 4738 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27304\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 07:05:30 crc kubenswrapper[4738]: W0307 07:05:30.429073 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-9cf82e51144a6e036d1e60816573ea69bcde8fd6f7c06ddfae6c4e1189e314af WatchSource:0}: Error finding container 9cf82e51144a6e036d1e60816573ea69bcde8fd6f7c06ddfae6c4e1189e314af: Status 404 returned error can't find the container with id 9cf82e51144a6e036d1e60816573ea69bcde8fd6f7c06ddfae6c4e1189e314af Mar 07 07:05:30 crc kubenswrapper[4738]: I0307 07:05:30.531688 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9cf82e51144a6e036d1e60816573ea69bcde8fd6f7c06ddfae6c4e1189e314af"} Mar 07 07:05:30 crc kubenswrapper[4738]: W0307 07:05:30.555698 4738 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27471": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 07:05:30 crc kubenswrapper[4738]: E0307 07:05:30.555801 4738 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27471\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 07:05:30 crc kubenswrapper[4738]: W0307 07:05:30.636985 4738 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27304": dial tcp 38.102.83.51:6443: connect: connection refused Mar 07 07:05:30 crc kubenswrapper[4738]: E0307 07:05:30.637086 4738 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27304\": dial tcp 38.102.83.51:6443: connect: connection refused" logger="UnhandledError" Mar 07 07:05:31 crc kubenswrapper[4738]: I0307 
07:05:31.539817 4738 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="728f7915019f50e1e4f1623ab83ae2c12ee83c5dfc026edc671d6d740bab4ee1" exitCode=0 Mar 07 07:05:31 crc kubenswrapper[4738]: I0307 07:05:31.539926 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"728f7915019f50e1e4f1623ab83ae2c12ee83c5dfc026edc671d6d740bab4ee1"} Mar 07 07:05:31 crc kubenswrapper[4738]: I0307 07:05:31.540709 4738 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3b9b5260-782a-4f85-8709-5ba6857b1340" Mar 07 07:05:31 crc kubenswrapper[4738]: I0307 07:05:31.540749 4738 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3b9b5260-782a-4f85-8709-5ba6857b1340" Mar 07 07:05:31 crc kubenswrapper[4738]: I0307 07:05:31.540779 4738 status_manager.go:851] "Failed to get status for pod" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" pod="openshift-authentication/oauth-openshift-558db77b4-7vtnd" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-7vtnd\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:31 crc kubenswrapper[4738]: E0307 07:05:31.541406 4738 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:31 crc kubenswrapper[4738]: I0307 07:05:31.541450 4738 status_manager.go:851] "Failed to get status for pod" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:31 crc kubenswrapper[4738]: I0307 07:05:31.541954 4738 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.51:6443: connect: connection refused" Mar 07 07:05:32 crc kubenswrapper[4738]: I0307 07:05:32.553417 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"508879ed5cf582992254822982e65ed828ca226535485c3a35f5f59214d90a75"} Mar 07 07:05:32 crc kubenswrapper[4738]: I0307 07:05:32.553462 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c1c78f7bf90638c21aa32a27aa32234ac1b20b48158c925ed9fa035d0501052c"} Mar 07 07:05:32 crc kubenswrapper[4738]: I0307 07:05:32.553477 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4c286d60ad02e947735f51aeecdf3b5c21f2ecd650e50b373a7d80cfd5126c23"} Mar 07 07:05:33 crc kubenswrapper[4738]: I0307 07:05:33.561335 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"63ae8e837c3a2f50185fe994473b0ec93f1ef1c1653f4fda18a300f54e065a39"} Mar 07 07:05:33 crc kubenswrapper[4738]: I0307 07:05:33.561673 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"df98e7ecc6e868ddcd28b75dd0a91b791c23ed296319dcffd5e39c7b418ad369"} Mar 07 07:05:33 crc kubenswrapper[4738]: I0307 07:05:33.561695 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:33 crc kubenswrapper[4738]: I0307 07:05:33.561695 4738 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3b9b5260-782a-4f85-8709-5ba6857b1340" Mar 07 07:05:33 crc kubenswrapper[4738]: I0307 07:05:33.561731 4738 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3b9b5260-782a-4f85-8709-5ba6857b1340" Mar 07 07:05:34 crc kubenswrapper[4738]: I0307 07:05:34.568335 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 07 07:05:34 crc kubenswrapper[4738]: I0307 07:05:34.569727 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 07 07:05:34 crc kubenswrapper[4738]: I0307 07:05:34.569773 4738 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985" exitCode=1 Mar 07 07:05:34 crc kubenswrapper[4738]: I0307 07:05:34.569802 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985"} Mar 07 07:05:34 crc kubenswrapper[4738]: I0307 07:05:34.570137 4738 scope.go:117] "RemoveContainer" 
containerID="f2341ca8dd36ee4163c9c76be0e4ac1181a2f1e0cce11e8682770e2086661985" Mar 07 07:05:34 crc kubenswrapper[4738]: I0307 07:05:34.974666 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 07:05:35 crc kubenswrapper[4738]: I0307 07:05:35.204940 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 07 07:05:35 crc kubenswrapper[4738]: I0307 07:05:35.222051 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 07 07:05:35 crc kubenswrapper[4738]: I0307 07:05:35.405500 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:35 crc kubenswrapper[4738]: I0307 07:05:35.405558 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:35 crc kubenswrapper[4738]: I0307 07:05:35.415925 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:35 crc kubenswrapper[4738]: I0307 07:05:35.581986 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 07 07:05:35 crc kubenswrapper[4738]: I0307 07:05:35.583660 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 07 07:05:35 crc kubenswrapper[4738]: I0307 07:05:35.583761 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e2b19d33fc2798f36586101ab1d6e99664252a41fa5bd3f656251987f2f9cc29"} Mar 07 07:05:35 crc kubenswrapper[4738]: I0307 07:05:35.939973 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 07 07:05:38 crc kubenswrapper[4738]: I0307 07:05:38.574433 4738 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:38 crc kubenswrapper[4738]: I0307 07:05:38.602728 4738 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3b9b5260-782a-4f85-8709-5ba6857b1340" Mar 07 07:05:38 crc kubenswrapper[4738]: I0307 07:05:38.602770 4738 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3b9b5260-782a-4f85-8709-5ba6857b1340" Mar 07 07:05:38 crc kubenswrapper[4738]: I0307 07:05:38.607402 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:05:38 crc kubenswrapper[4738]: I0307 07:05:38.611254 4738 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1d5b2b97-be88-455f-b11a-4729e0f68368" Mar 07 07:05:39 crc kubenswrapper[4738]: I0307 07:05:39.609564 4738 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3b9b5260-782a-4f85-8709-5ba6857b1340" Mar 07 07:05:39 crc kubenswrapper[4738]: I0307 07:05:39.609636 4738 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="3b9b5260-782a-4f85-8709-5ba6857b1340" Mar 07 07:05:39 crc kubenswrapper[4738]: I0307 07:05:39.633670 4738 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 07 07:05:41 crc kubenswrapper[4738]: I0307 07:05:41.957569 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 07:05:41 crc kubenswrapper[4738]: I0307 07:05:41.962941 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 07:05:42 crc kubenswrapper[4738]: I0307 07:05:42.405385 4738 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="1d5b2b97-be88-455f-b11a-4729e0f68368" Mar 07 07:05:42 crc kubenswrapper[4738]: I0307 07:05:42.627764 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 07:05:44 crc kubenswrapper[4738]: I0307 07:05:44.980081 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 07 07:05:48 crc kubenswrapper[4738]: E0307 07:05:48.414540 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 07 07:05:48 crc kubenswrapper[4738]: E0307 07:05:48.424526 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 07 07:05:48 crc kubenswrapper[4738]: E0307 07:05:48.430731 4738 
pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 07 07:05:48 crc kubenswrapper[4738]: I0307 07:05:48.821287 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 07 07:05:49 crc kubenswrapper[4738]: I0307 07:05:49.176091 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 07 07:05:49 crc kubenswrapper[4738]: I0307 07:05:49.524614 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 07 07:05:49 crc kubenswrapper[4738]: I0307 07:05:49.600016 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 07 07:05:50 crc kubenswrapper[4738]: I0307 07:05:50.217671 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 07 07:05:50 crc kubenswrapper[4738]: I0307 07:05:50.437870 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 07:05:50 crc kubenswrapper[4738]: I0307 07:05:50.490398 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 07 07:05:50 crc kubenswrapper[4738]: I0307 07:05:50.590098 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 07:05:50 crc kubenswrapper[4738]: I0307 07:05:50.752098 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 07 07:05:50 crc 
kubenswrapper[4738]: I0307 07:05:50.811483 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 07 07:05:51 crc kubenswrapper[4738]: I0307 07:05:51.044274 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 07 07:05:51 crc kubenswrapper[4738]: I0307 07:05:51.208034 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 07 07:05:51 crc kubenswrapper[4738]: I0307 07:05:51.209863 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 07 07:05:51 crc kubenswrapper[4738]: I0307 07:05:51.228610 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 07:05:51 crc kubenswrapper[4738]: I0307 07:05:51.254372 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 07 07:05:51 crc kubenswrapper[4738]: I0307 07:05:51.279069 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 07 07:05:51 crc kubenswrapper[4738]: I0307 07:05:51.453981 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 07 07:05:51 crc kubenswrapper[4738]: I0307 07:05:51.535224 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 07:05:51 crc kubenswrapper[4738]: I0307 07:05:51.611025 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 07 07:05:51 crc kubenswrapper[4738]: I0307 07:05:51.851716 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" 
Mar 07 07:05:51 crc kubenswrapper[4738]: I0307 07:05:51.982944 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 07 07:05:52 crc kubenswrapper[4738]: I0307 07:05:52.095688 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 07 07:05:52 crc kubenswrapper[4738]: I0307 07:05:52.353510 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 07 07:05:52 crc kubenswrapper[4738]: I0307 07:05:52.605037 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 07:05:52 crc kubenswrapper[4738]: I0307 07:05:52.909666 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 07:05:53 crc kubenswrapper[4738]: I0307 07:05:53.007743 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 07 07:05:53 crc kubenswrapper[4738]: I0307 07:05:53.117873 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 07 07:05:53 crc kubenswrapper[4738]: I0307 07:05:53.152539 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 07 07:05:53 crc kubenswrapper[4738]: I0307 07:05:53.186612 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 07 07:05:53 crc kubenswrapper[4738]: I0307 07:05:53.221858 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 07 07:05:53 crc kubenswrapper[4738]: I0307 07:05:53.286429 4738 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 07 07:05:53 crc kubenswrapper[4738]: I0307 07:05:53.336923 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 07 07:05:53 crc kubenswrapper[4738]: I0307 07:05:53.352460 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 07 07:05:53 crc kubenswrapper[4738]: I0307 07:05:53.674419 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 07 07:05:53 crc kubenswrapper[4738]: I0307 07:05:53.721067 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 07 07:05:53 crc kubenswrapper[4738]: I0307 07:05:53.959393 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 07 07:05:54 crc kubenswrapper[4738]: I0307 07:05:54.176669 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 07 07:05:54 crc kubenswrapper[4738]: I0307 07:05:54.181880 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 07 07:05:54 crc kubenswrapper[4738]: I0307 07:05:54.248200 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 07 07:05:54 crc kubenswrapper[4738]: I0307 07:05:54.266655 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 07 07:05:54 crc kubenswrapper[4738]: I0307 07:05:54.358554 4738 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 07 07:05:54 crc kubenswrapper[4738]: I0307 07:05:54.459184 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 07 07:05:54 crc kubenswrapper[4738]: I0307 07:05:54.561348 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 07 07:05:54 crc kubenswrapper[4738]: I0307 07:05:54.604097 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 07 07:05:54 crc kubenswrapper[4738]: I0307 07:05:54.698056 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 07 07:05:54 crc kubenswrapper[4738]: I0307 07:05:54.730869 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 07 07:05:54 crc kubenswrapper[4738]: I0307 07:05:54.908654 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.025804 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.094586 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.095414 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.104079 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 
07:05:55.104275 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.201189 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.255784 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.322991 4738 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.419924 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.490066 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.498207 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.597071 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.656976 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.772791 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.795701 4738 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.830665 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 07 07:05:55 crc kubenswrapper[4738]: I0307 07:05:55.836638 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.026242 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.074131 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.077030 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.164091 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.201987 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.226884 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.256043 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.321730 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 07 07:05:56 crc kubenswrapper[4738]: 
I0307 07:05:56.522247 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.575200 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.602496 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.604135 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.638842 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.720403 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.768056 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.886380 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 07 07:05:56 crc kubenswrapper[4738]: I0307 07:05:56.985558 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 07 07:05:57 crc kubenswrapper[4738]: I0307 07:05:57.015803 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 07:05:57 crc kubenswrapper[4738]: I0307 07:05:57.049916 4738 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 07 07:05:57 crc kubenswrapper[4738]: I0307 07:05:57.139348 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 07 07:05:57 crc kubenswrapper[4738]: I0307 07:05:57.285828 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 07 07:05:57 crc kubenswrapper[4738]: I0307 07:05:57.328519 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 07 07:05:57 crc kubenswrapper[4738]: I0307 07:05:57.407262 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 07 07:05:57 crc kubenswrapper[4738]: I0307 07:05:57.486612 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 07 07:05:57 crc kubenswrapper[4738]: I0307 07:05:57.590415 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 07 07:05:57 crc kubenswrapper[4738]: I0307 07:05:57.598717 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 07 07:05:57 crc kubenswrapper[4738]: I0307 07:05:57.639450 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 07 07:05:57 crc kubenswrapper[4738]: I0307 07:05:57.655523 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 07 07:05:57 crc kubenswrapper[4738]: I0307 07:05:57.657277 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 07 07:05:57 crc kubenswrapper[4738]: 
I0307 07:05:57.706328 4738 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 07 07:05:57 crc kubenswrapper[4738]: I0307 07:05:57.957903 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 07 07:05:57 crc kubenswrapper[4738]: I0307 07:05:57.985917 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.014017 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.082808 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.145351 4738 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.153558 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.182359 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.266588 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.291289 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.297619 4738 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.321824 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.362381 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.382676 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.536949 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.540805 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.551929 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.578573 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.589991 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.597621 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.744324 4738 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 07 07:05:58 crc 
kubenswrapper[4738]: I0307 07:05:58.746410 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=39.74639696 podStartE2EDuration="39.74639696s" podCreationTimestamp="2026-03-07 07:05:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:05:38.512910323 +0000 UTC m=+356.977897654" watchObservedRunningTime="2026-03-07 07:05:58.74639696 +0000 UTC m=+377.211384281" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.748364 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-7vtnd"] Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.748413 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.748429 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cd6c649c-mgwvg","openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8"] Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.748589 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" podUID="d7bb0bc1-7992-4779-9ed5-23109e61365a" containerName="route-controller-manager" containerID="cri-o://e294130d32381a62f724b2e4b22446c39b45e82fb88b531bb3e16922c689a70d" gracePeriod=30 Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.748738 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" podUID="5ef65b2a-78bf-4efc-9adf-84472dbddb49" containerName="controller-manager" 
containerID="cri-o://8288323cb76757fcccf06415746798e4960c48b59bdd35eee2a7b48a56892643" gracePeriod=30 Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.752028 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.778851 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=20.778804832 podStartE2EDuration="20.778804832s" podCreationTimestamp="2026-03-07 07:05:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:05:58.775431872 +0000 UTC m=+377.240419203" watchObservedRunningTime="2026-03-07 07:05:58.778804832 +0000 UTC m=+377.243792143" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.826769 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.837612 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 07 07:05:58 crc kubenswrapper[4738]: I0307 07:05:58.940780 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.139351 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.208639 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.271793 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.284394 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.391009 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.434413 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ef65b2a-78bf-4efc-9adf-84472dbddb49-serving-cert\") pod \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.434517 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-config\") pod \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.434544 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdbt7\" (UniqueName: \"kubernetes.io/projected/d7bb0bc1-7992-4779-9ed5-23109e61365a-kube-api-access-bdbt7\") pod \"d7bb0bc1-7992-4779-9ed5-23109e61365a\" (UID: \"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.434777 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-client-ca\") pod \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.434831 4738 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-lmzcm\" (UniqueName: \"kubernetes.io/projected/5ef65b2a-78bf-4efc-9adf-84472dbddb49-kube-api-access-lmzcm\") pod \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.434863 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-proxy-ca-bundles\") pod \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\" (UID: \"5ef65b2a-78bf-4efc-9adf-84472dbddb49\") " Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.434915 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7bb0bc1-7992-4779-9ed5-23109e61365a-serving-cert\") pod \"d7bb0bc1-7992-4779-9ed5-23109e61365a\" (UID: \"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.434953 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7bb0bc1-7992-4779-9ed5-23109e61365a-client-ca\") pod \"d7bb0bc1-7992-4779-9ed5-23109e61365a\" (UID: \"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.434984 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bb0bc1-7992-4779-9ed5-23109e61365a-config\") pod \"d7bb0bc1-7992-4779-9ed5-23109e61365a\" (UID: \"d7bb0bc1-7992-4779-9ed5-23109e61365a\") " Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.435793 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-config" (OuterVolumeSpecName: "config") pod "5ef65b2a-78bf-4efc-9adf-84472dbddb49" (UID: "5ef65b2a-78bf-4efc-9adf-84472dbddb49"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.435771 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-client-ca" (OuterVolumeSpecName: "client-ca") pod "5ef65b2a-78bf-4efc-9adf-84472dbddb49" (UID: "5ef65b2a-78bf-4efc-9adf-84472dbddb49"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.435879 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5ef65b2a-78bf-4efc-9adf-84472dbddb49" (UID: "5ef65b2a-78bf-4efc-9adf-84472dbddb49"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.436020 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7bb0bc1-7992-4779-9ed5-23109e61365a-client-ca" (OuterVolumeSpecName: "client-ca") pod "d7bb0bc1-7992-4779-9ed5-23109e61365a" (UID: "d7bb0bc1-7992-4779-9ed5-23109e61365a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.436204 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7bb0bc1-7992-4779-9ed5-23109e61365a-config" (OuterVolumeSpecName: "config") pod "d7bb0bc1-7992-4779-9ed5-23109e61365a" (UID: "d7bb0bc1-7992-4779-9ed5-23109e61365a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.441759 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ef65b2a-78bf-4efc-9adf-84472dbddb49-kube-api-access-lmzcm" (OuterVolumeSpecName: "kube-api-access-lmzcm") pod "5ef65b2a-78bf-4efc-9adf-84472dbddb49" (UID: "5ef65b2a-78bf-4efc-9adf-84472dbddb49"). InnerVolumeSpecName "kube-api-access-lmzcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.442101 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ef65b2a-78bf-4efc-9adf-84472dbddb49-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5ef65b2a-78bf-4efc-9adf-84472dbddb49" (UID: "5ef65b2a-78bf-4efc-9adf-84472dbddb49"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.442270 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7bb0bc1-7992-4779-9ed5-23109e61365a-kube-api-access-bdbt7" (OuterVolumeSpecName: "kube-api-access-bdbt7") pod "d7bb0bc1-7992-4779-9ed5-23109e61365a" (UID: "d7bb0bc1-7992-4779-9ed5-23109e61365a"). InnerVolumeSpecName "kube-api-access-bdbt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.442312 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7bb0bc1-7992-4779-9ed5-23109e61365a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d7bb0bc1-7992-4779-9ed5-23109e61365a" (UID: "d7bb0bc1-7992-4779-9ed5-23109e61365a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.469447 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.525891 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.536933 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d7bb0bc1-7992-4779-9ed5-23109e61365a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.537002 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d7bb0bc1-7992-4779-9ed5-23109e61365a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.537013 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7bb0bc1-7992-4779-9ed5-23109e61365a-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.537031 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ef65b2a-78bf-4efc-9adf-84472dbddb49-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.537043 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.537056 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdbt7\" (UniqueName: \"kubernetes.io/projected/d7bb0bc1-7992-4779-9ed5-23109e61365a-kube-api-access-bdbt7\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:59 crc 
kubenswrapper[4738]: I0307 07:05:59.537069 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.537082 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmzcm\" (UniqueName: \"kubernetes.io/projected/5ef65b2a-78bf-4efc-9adf-84472dbddb49-kube-api-access-lmzcm\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.537092 4738 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5ef65b2a-78bf-4efc-9adf-84472dbddb49-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.577426 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.591311 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.624543 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.658052 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.716046 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.730458 4738 generic.go:334] "Generic (PLEG): container finished" podID="5ef65b2a-78bf-4efc-9adf-84472dbddb49" containerID="8288323cb76757fcccf06415746798e4960c48b59bdd35eee2a7b48a56892643" exitCode=0 Mar 07 07:05:59 crc 
kubenswrapper[4738]: I0307 07:05:59.730521 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.730539 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" event={"ID":"5ef65b2a-78bf-4efc-9adf-84472dbddb49","Type":"ContainerDied","Data":"8288323cb76757fcccf06415746798e4960c48b59bdd35eee2a7b48a56892643"} Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.730573 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cd6c649c-mgwvg" event={"ID":"5ef65b2a-78bf-4efc-9adf-84472dbddb49","Type":"ContainerDied","Data":"1064a169f3bd87a40e921742b3221b4d3bc62a3c483abdd026de8a1930089bf0"} Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.730597 4738 scope.go:117] "RemoveContainer" containerID="8288323cb76757fcccf06415746798e4960c48b59bdd35eee2a7b48a56892643" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.732676 4738 generic.go:334] "Generic (PLEG): container finished" podID="d7bb0bc1-7992-4779-9ed5-23109e61365a" containerID="e294130d32381a62f724b2e4b22446c39b45e82fb88b531bb3e16922c689a70d" exitCode=0 Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.732707 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" event={"ID":"d7bb0bc1-7992-4779-9ed5-23109e61365a","Type":"ContainerDied","Data":"e294130d32381a62f724b2e4b22446c39b45e82fb88b531bb3e16922c689a70d"} Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.732726 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" 
event={"ID":"d7bb0bc1-7992-4779-9ed5-23109e61365a","Type":"ContainerDied","Data":"a578914dc7ea677bd84d00b6a3a6e5b514da27a4dc176831eccd0e5db48c691d"} Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.732776 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.753485 4738 scope.go:117] "RemoveContainer" containerID="8288323cb76757fcccf06415746798e4960c48b59bdd35eee2a7b48a56892643" Mar 07 07:05:59 crc kubenswrapper[4738]: E0307 07:05:59.754180 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8288323cb76757fcccf06415746798e4960c48b59bdd35eee2a7b48a56892643\": container with ID starting with 8288323cb76757fcccf06415746798e4960c48b59bdd35eee2a7b48a56892643 not found: ID does not exist" containerID="8288323cb76757fcccf06415746798e4960c48b59bdd35eee2a7b48a56892643" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.754222 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8288323cb76757fcccf06415746798e4960c48b59bdd35eee2a7b48a56892643"} err="failed to get container status \"8288323cb76757fcccf06415746798e4960c48b59bdd35eee2a7b48a56892643\": rpc error: code = NotFound desc = could not find container \"8288323cb76757fcccf06415746798e4960c48b59bdd35eee2a7b48a56892643\": container with ID starting with 8288323cb76757fcccf06415746798e4960c48b59bdd35eee2a7b48a56892643 not found: ID does not exist" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.754248 4738 scope.go:117] "RemoveContainer" containerID="e294130d32381a62f724b2e4b22446c39b45e82fb88b531bb3e16922c689a70d" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.767181 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 07 07:05:59 crc kubenswrapper[4738]: 
I0307 07:05:59.771889 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8"] Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.777607 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-557d8dc587-lsnb8"] Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.782394 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6cd6c649c-mgwvg"] Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.783407 4738 scope.go:117] "RemoveContainer" containerID="e294130d32381a62f724b2e4b22446c39b45e82fb88b531bb3e16922c689a70d" Mar 07 07:05:59 crc kubenswrapper[4738]: E0307 07:05:59.783856 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e294130d32381a62f724b2e4b22446c39b45e82fb88b531bb3e16922c689a70d\": container with ID starting with e294130d32381a62f724b2e4b22446c39b45e82fb88b531bb3e16922c689a70d not found: ID does not exist" containerID="e294130d32381a62f724b2e4b22446c39b45e82fb88b531bb3e16922c689a70d" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.783895 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e294130d32381a62f724b2e4b22446c39b45e82fb88b531bb3e16922c689a70d"} err="failed to get container status \"e294130d32381a62f724b2e4b22446c39b45e82fb88b531bb3e16922c689a70d\": rpc error: code = NotFound desc = could not find container \"e294130d32381a62f724b2e4b22446c39b45e82fb88b531bb3e16922c689a70d\": container with ID starting with e294130d32381a62f724b2e4b22446c39b45e82fb88b531bb3e16922c689a70d not found: ID does not exist" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.786457 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6cd6c649c-mgwvg"] Mar 07 07:05:59 crc 
kubenswrapper[4738]: I0307 07:05:59.822253 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 07 07:05:59 crc kubenswrapper[4738]: I0307 07:05:59.848507 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.041025 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.148600 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.175468 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.195998 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.199740 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.241047 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.252230 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.268785 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.295954 4738 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.320734 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.368257 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.402372 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ef65b2a-78bf-4efc-9adf-84472dbddb49" path="/var/lib/kubelet/pods/5ef65b2a-78bf-4efc-9adf-84472dbddb49/volumes" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.402999 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" path="/var/lib/kubelet/pods/cd2b6e69-054d-461b-8a1d-ca38261a83d3/volumes" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.403918 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7bb0bc1-7992-4779-9ed5-23109e61365a" path="/var/lib/kubelet/pods/d7bb0bc1-7992-4779-9ed5-23109e61365a/volumes" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.414511 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.455235 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.487106 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.689740 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.710658 4738 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.727331 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.788488 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.803833 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.844947 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.852967 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.955418 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.956082 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 07 07:06:00 crc kubenswrapper[4738]: I0307 07:06:00.996216 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.019134 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.024088 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 07 
07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.071070 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.074321 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.148545 4738 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.149289 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://3eec800aab5f2d2bb340840c98b491de29da4e5b97b3ebf9f27411d4def6423f" gracePeriod=5 Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.177005 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.192609 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.211558 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.218930 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.259211 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.280283 4738 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-dns"/"kube-root-ca.crt" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.294529 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.343812 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.397125 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.404726 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547786-66p2d"] Mar 07 07:06:01 crc kubenswrapper[4738]: E0307 07:06:01.404955 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" containerName="installer" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.404969 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" containerName="installer" Mar 07 07:06:01 crc kubenswrapper[4738]: E0307 07:06:01.404979 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ef65b2a-78bf-4efc-9adf-84472dbddb49" containerName="controller-manager" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.404988 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ef65b2a-78bf-4efc-9adf-84472dbddb49" containerName="controller-manager" Mar 07 07:06:01 crc kubenswrapper[4738]: E0307 07:06:01.405004 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7bb0bc1-7992-4779-9ed5-23109e61365a" containerName="route-controller-manager" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.405011 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7bb0bc1-7992-4779-9ed5-23109e61365a" containerName="route-controller-manager" 
Mar 07 07:06:01 crc kubenswrapper[4738]: E0307 07:06:01.405027 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.405034 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 07 07:06:01 crc kubenswrapper[4738]: E0307 07:06:01.405044 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" containerName="oauth-openshift" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.405051 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" containerName="oauth-openshift" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.405176 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.405189 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ef65b2a-78bf-4efc-9adf-84472dbddb49" containerName="controller-manager" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.405202 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="03158b9d-6cc1-4f5d-af1f-d21de41d536d" containerName="installer" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.405213 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd2b6e69-054d-461b-8a1d-ca38261a83d3" containerName="oauth-openshift" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.405226 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7bb0bc1-7992-4779-9ed5-23109e61365a" containerName="route-controller-manager" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.405639 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547786-66p2d" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.407437 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.407726 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.407978 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.412290 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm"] Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.412878 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.417224 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.417427 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.417522 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.417672 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.417687 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 07:06:01 crc 
kubenswrapper[4738]: I0307 07:06:01.417817 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.419496 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-588b9dc876-pt9tw"] Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.420272 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.424307 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.424529 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.424628 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.426301 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547786-66p2d"] Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.433268 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm"] Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.434705 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.434705 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.435007 4738 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.435179 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.436741 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-588b9dc876-pt9tw"] Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.443416 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.443620 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.443843 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.450402 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.450981 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.459349 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.467431 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.484314 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 07 07:06:01 
crc kubenswrapper[4738]: I0307 07:06:01.508501 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.520348 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.548837 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.559995 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59183d23-c5a5-49cb-85a5-92bde3da9e12-config\") pod \"route-controller-manager-94986fdd4-7ngvm\" (UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.560038 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/21aa723c-98cc-43d1-8579-b41031726681-audit-policies\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.560439 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpw9g\" (UniqueName: \"kubernetes.io/projected/21aa723c-98cc-43d1-8579-b41031726681-kube-api-access-rpw9g\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.560507 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.560533 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jfdd\" (UniqueName: \"kubernetes.io/projected/b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4-kube-api-access-9jfdd\") pod \"auto-csr-approver-29547786-66p2d\" (UID: \"b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4\") " pod="openshift-infra/auto-csr-approver-29547786-66p2d" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.560565 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8kn9\" (UniqueName: \"kubernetes.io/projected/59183d23-c5a5-49cb-85a5-92bde3da9e12-kube-api-access-q8kn9\") pod \"route-controller-manager-94986fdd4-7ngvm\" (UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.560593 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-router-certs\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.560608 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-user-template-error\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.560632 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-service-ca\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.560672 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59183d23-c5a5-49cb-85a5-92bde3da9e12-client-ca\") pod \"route-controller-manager-94986fdd4-7ngvm\" (UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.560696 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.560721 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-cliconfig\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: 
\"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.560762 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-session\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.560962 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/21aa723c-98cc-43d1-8579-b41031726681-audit-dir\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.561010 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-user-template-login\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.561044 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.561072 4738 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.561096 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-serving-cert\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.561115 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59183d23-c5a5-49cb-85a5-92bde3da9e12-serving-cert\") pod \"route-controller-manager-94986fdd4-7ngvm\" (UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.662576 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59183d23-c5a5-49cb-85a5-92bde3da9e12-config\") pod \"route-controller-manager-94986fdd4-7ngvm\" (UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.662984 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/21aa723c-98cc-43d1-8579-b41031726681-audit-policies\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.663084 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpw9g\" (UniqueName: \"kubernetes.io/projected/21aa723c-98cc-43d1-8579-b41031726681-kube-api-access-rpw9g\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.663191 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.663283 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jfdd\" (UniqueName: \"kubernetes.io/projected/b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4-kube-api-access-9jfdd\") pod \"auto-csr-approver-29547786-66p2d\" (UID: \"b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4\") " pod="openshift-infra/auto-csr-approver-29547786-66p2d" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.663368 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8kn9\" (UniqueName: \"kubernetes.io/projected/59183d23-c5a5-49cb-85a5-92bde3da9e12-kube-api-access-q8kn9\") pod \"route-controller-manager-94986fdd4-7ngvm\" (UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 
07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.663446 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-router-certs\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.663517 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-user-template-error\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.663602 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-service-ca\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.663689 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59183d23-c5a5-49cb-85a5-92bde3da9e12-client-ca\") pod \"route-controller-manager-94986fdd4-7ngvm\" (UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.663763 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.663841 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-cliconfig\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.663927 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-session\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.664008 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/21aa723c-98cc-43d1-8579-b41031726681-audit-dir\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.664095 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-user-template-login\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc 
kubenswrapper[4738]: I0307 07:06:01.664197 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.664275 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.664356 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-serving-cert\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.664443 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59183d23-c5a5-49cb-85a5-92bde3da9e12-serving-cert\") pod \"route-controller-manager-94986fdd4-7ngvm\" (UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.665300 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/21aa723c-98cc-43d1-8579-b41031726681-audit-dir\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.665984 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-service-ca\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.666318 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59183d23-c5a5-49cb-85a5-92bde3da9e12-client-ca\") pod \"route-controller-manager-94986fdd4-7ngvm\" (UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.666673 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/21aa723c-98cc-43d1-8579-b41031726681-audit-policies\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.667801 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59183d23-c5a5-49cb-85a5-92bde3da9e12-config\") pod \"route-controller-manager-94986fdd4-7ngvm\" (UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.667920 4738 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.668073 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-cliconfig\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.670639 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.672495 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-session\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.672735 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-user-template-login\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: 
\"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.674537 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59183d23-c5a5-49cb-85a5-92bde3da9e12-serving-cert\") pod \"route-controller-manager-94986fdd4-7ngvm\" (UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.678707 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-router-certs\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.680116 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-user-template-error\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.680706 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.686051 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rpw9g\" (UniqueName: \"kubernetes.io/projected/21aa723c-98cc-43d1-8579-b41031726681-kube-api-access-rpw9g\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.687472 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.688087 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/21aa723c-98cc-43d1-8579-b41031726681-v4-0-config-system-serving-cert\") pod \"oauth-openshift-588b9dc876-pt9tw\" (UID: \"21aa723c-98cc-43d1-8579-b41031726681\") " pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.692533 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8kn9\" (UniqueName: \"kubernetes.io/projected/59183d23-c5a5-49cb-85a5-92bde3da9e12-kube-api-access-q8kn9\") pod \"route-controller-manager-94986fdd4-7ngvm\" (UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.701269 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jfdd\" (UniqueName: \"kubernetes.io/projected/b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4-kube-api-access-9jfdd\") pod \"auto-csr-approver-29547786-66p2d\" (UID: \"b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4\") " 
pod="openshift-infra/auto-csr-approver-29547786-66p2d" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.721564 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547786-66p2d" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.752521 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.781718 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.784437 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.823821 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.927659 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.941806 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 07 07:06:01 crc kubenswrapper[4738]: I0307 07:06:01.966447 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.052763 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.114644 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" 
Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.155397 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547786-66p2d"] Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.186347 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.194317 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm"] Mar 07 07:06:02 crc kubenswrapper[4738]: W0307 07:06:02.196444 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59183d23_c5a5_49cb_85a5_92bde3da9e12.slice/crio-1fbb72305780bea4782ca43e3b365b838c3f05e044474307dedec8adb4d30b52 WatchSource:0}: Error finding container 1fbb72305780bea4782ca43e3b365b838c3f05e044474307dedec8adb4d30b52: Status 404 returned error can't find the container with id 1fbb72305780bea4782ca43e3b365b838c3f05e044474307dedec8adb4d30b52 Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.252145 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-588b9dc876-pt9tw"] Mar 07 07:06:02 crc kubenswrapper[4738]: W0307 07:06:02.256903 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21aa723c_98cc_43d1_8579_b41031726681.slice/crio-16551ea72d0f63f844a61f7ed512d196c932e907aaddb045fdec9f8efda2e3e7 WatchSource:0}: Error finding container 16551ea72d0f63f844a61f7ed512d196c932e907aaddb045fdec9f8efda2e3e7: Status 404 returned error can't find the container with id 16551ea72d0f63f844a61f7ed512d196c932e907aaddb045fdec9f8efda2e3e7 Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.367981 4738 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.390021 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.396150 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.401713 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.437333 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.493782 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.496725 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.500925 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.541487 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.596594 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.624647 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.710136 4738 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79875d7946-wtcdx"] Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.710779 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.713073 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.713072 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.713206 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.713610 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.715592 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.716214 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.722519 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.728546 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79875d7946-wtcdx"] Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.759472 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547786-66p2d" 
event={"ID":"b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4","Type":"ContainerStarted","Data":"9ac71f02458f45a610ea3ba5797018b6a3325341da7b31229a03466f5b458c07"} Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.762234 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" event={"ID":"59183d23-c5a5-49cb-85a5-92bde3da9e12","Type":"ContainerStarted","Data":"9ae070384c338536c4463069cfb3642007c943d45cc3e5036cddfdeaa2c77964"} Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.762266 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" event={"ID":"59183d23-c5a5-49cb-85a5-92bde3da9e12","Type":"ContainerStarted","Data":"1fbb72305780bea4782ca43e3b365b838c3f05e044474307dedec8adb4d30b52"} Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.762869 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.766481 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" event={"ID":"21aa723c-98cc-43d1-8579-b41031726681","Type":"ContainerStarted","Data":"db4f607c4ffa4495263665fb0e2bcd4172bfa6fc2c3e64ff4ff75a7e767c4328"} Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.766537 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" event={"ID":"21aa723c-98cc-43d1-8579-b41031726681","Type":"ContainerStarted","Data":"16551ea72d0f63f844a61f7ed512d196c932e907aaddb045fdec9f8efda2e3e7"} Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.766545 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.766966 4738 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.771145 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.795302 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" podStartSLOduration=8.795271987 podStartE2EDuration="8.795271987s" podCreationTimestamp="2026-03-07 07:05:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:06:02.786957283 +0000 UTC m=+381.251944614" watchObservedRunningTime="2026-03-07 07:06:02.795271987 +0000 UTC m=+381.260259348" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.817269 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" podStartSLOduration=67.817249829 podStartE2EDuration="1m7.817249829s" podCreationTimestamp="2026-03-07 07:04:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:06:02.814495755 +0000 UTC m=+381.279483096" watchObservedRunningTime="2026-03-07 07:06:02.817249829 +0000 UTC m=+381.282237170" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.881673 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b28281-00ad-468a-8a6c-40c5930bc52f-serving-cert\") pod \"controller-manager-79875d7946-wtcdx\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:02 crc 
kubenswrapper[4738]: I0307 07:06:02.881722 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmxfk\" (UniqueName: \"kubernetes.io/projected/f5b28281-00ad-468a-8a6c-40c5930bc52f-kube-api-access-pmxfk\") pod \"controller-manager-79875d7946-wtcdx\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.881819 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-config\") pod \"controller-manager-79875d7946-wtcdx\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.881883 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-proxy-ca-bundles\") pod \"controller-manager-79875d7946-wtcdx\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.881938 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-client-ca\") pod \"controller-manager-79875d7946-wtcdx\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.928652 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-588b9dc876-pt9tw" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.987359 
4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b28281-00ad-468a-8a6c-40c5930bc52f-serving-cert\") pod \"controller-manager-79875d7946-wtcdx\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.987449 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmxfk\" (UniqueName: \"kubernetes.io/projected/f5b28281-00ad-468a-8a6c-40c5930bc52f-kube-api-access-pmxfk\") pod \"controller-manager-79875d7946-wtcdx\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.987565 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-config\") pod \"controller-manager-79875d7946-wtcdx\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.987608 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-proxy-ca-bundles\") pod \"controller-manager-79875d7946-wtcdx\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.987643 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-client-ca\") pod \"controller-manager-79875d7946-wtcdx\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " 
pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.989051 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-client-ca\") pod \"controller-manager-79875d7946-wtcdx\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.990424 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-config\") pod \"controller-manager-79875d7946-wtcdx\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:02 crc kubenswrapper[4738]: I0307 07:06:02.990543 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-proxy-ca-bundles\") pod \"controller-manager-79875d7946-wtcdx\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.013669 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b28281-00ad-468a-8a6c-40c5930bc52f-serving-cert\") pod \"controller-manager-79875d7946-wtcdx\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.024634 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmxfk\" (UniqueName: \"kubernetes.io/projected/f5b28281-00ad-468a-8a6c-40c5930bc52f-kube-api-access-pmxfk\") pod 
\"controller-manager-79875d7946-wtcdx\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.038625 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.044356 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.189015 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.205840 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.206564 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.232702 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79875d7946-wtcdx"] Mar 07 07:06:03 crc kubenswrapper[4738]: W0307 07:06:03.237991 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5b28281_00ad_468a_8a6c_40c5930bc52f.slice/crio-1f94d68531b8417f1e838880b8b5ca997f9033decc81168ac8419ed0f9fff431 WatchSource:0}: Error finding container 1f94d68531b8417f1e838880b8b5ca997f9033decc81168ac8419ed0f9fff431: Status 404 returned error can't find the container with id 1f94d68531b8417f1e838880b8b5ca997f9033decc81168ac8419ed0f9fff431 Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.384873 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.420686 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.542783 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.721217 4738 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.745681 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.772880 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" event={"ID":"f5b28281-00ad-468a-8a6c-40c5930bc52f","Type":"ContainerStarted","Data":"fd7759967742847d562ec93d52bb4bb05774216f1ba865eea2ab8c363234ccdb"} Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.773196 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" event={"ID":"f5b28281-00ad-468a-8a6c-40c5930bc52f","Type":"ContainerStarted","Data":"1f94d68531b8417f1e838880b8b5ca997f9033decc81168ac8419ed0f9fff431"} Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.773688 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.775302 4738 generic.go:334] "Generic (PLEG): container finished" podID="b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4" containerID="b22ad71b7f687dd90c5dd50584cc04290b1dfe21234cb408d282b7612dfb18f0" exitCode=0 Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 
07:06:03.775625 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547786-66p2d" event={"ID":"b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4","Type":"ContainerDied","Data":"b22ad71b7f687dd90c5dd50584cc04290b1dfe21234cb408d282b7612dfb18f0"} Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.778255 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.822506 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" podStartSLOduration=9.822481732 podStartE2EDuration="9.822481732s" podCreationTimestamp="2026-03-07 07:05:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:06:03.802501555 +0000 UTC m=+382.267488876" watchObservedRunningTime="2026-03-07 07:06:03.822481732 +0000 UTC m=+382.287469053" Mar 07 07:06:03 crc kubenswrapper[4738]: I0307 07:06:03.930435 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.095492 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.287085 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.296449 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.326871 4738 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.393287 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.394075 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.406576 4738 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.442440 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.465429 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.489317 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.621791 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.704087 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.708805 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.751835 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 07 07:06:04 crc 
kubenswrapper[4738]: I0307 07:06:04.760359 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.812815 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 07 07:06:04 crc kubenswrapper[4738]: I0307 07:06:04.958560 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 07 07:06:05 crc kubenswrapper[4738]: I0307 07:06:05.095341 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547786-66p2d" Mar 07 07:06:05 crc kubenswrapper[4738]: I0307 07:06:05.113408 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jfdd\" (UniqueName: \"kubernetes.io/projected/b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4-kube-api-access-9jfdd\") pod \"b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4\" (UID: \"b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4\") " Mar 07 07:06:05 crc kubenswrapper[4738]: I0307 07:06:05.120290 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4-kube-api-access-9jfdd" (OuterVolumeSpecName: "kube-api-access-9jfdd") pod "b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4" (UID: "b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4"). InnerVolumeSpecName "kube-api-access-9jfdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:06:05 crc kubenswrapper[4738]: I0307 07:06:05.214813 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jfdd\" (UniqueName: \"kubernetes.io/projected/b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4-kube-api-access-9jfdd\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:05 crc kubenswrapper[4738]: I0307 07:06:05.400704 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 07 07:06:05 crc kubenswrapper[4738]: I0307 07:06:05.497555 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 07 07:06:05 crc kubenswrapper[4738]: I0307 07:06:05.568192 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 07 07:06:05 crc kubenswrapper[4738]: I0307 07:06:05.629959 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 07 07:06:05 crc kubenswrapper[4738]: I0307 07:06:05.791824 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547786-66p2d" event={"ID":"b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4","Type":"ContainerDied","Data":"9ac71f02458f45a610ea3ba5797018b6a3325341da7b31229a03466f5b458c07"} Mar 07 07:06:05 crc kubenswrapper[4738]: I0307 07:06:05.791885 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ac71f02458f45a610ea3ba5797018b6a3325341da7b31229a03466f5b458c07" Mar 07 07:06:05 crc kubenswrapper[4738]: I0307 07:06:05.792534 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547786-66p2d" Mar 07 07:06:05 crc kubenswrapper[4738]: I0307 07:06:05.961442 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.096149 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.247250 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.309579 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.617065 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.617312 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.715096 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.728403 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.728513 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.742328 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.775612 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.800147 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.800214 4738 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="3eec800aab5f2d2bb340840c98b491de29da4e5b97b3ebf9f27411d4def6423f" exitCode=137 Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.800253 4738 scope.go:117] "RemoveContainer" containerID="3eec800aab5f2d2bb340840c98b491de29da4e5b97b3ebf9f27411d4def6423f" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.800297 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.816564 4738 scope.go:117] "RemoveContainer" containerID="3eec800aab5f2d2bb340840c98b491de29da4e5b97b3ebf9f27411d4def6423f" Mar 07 07:06:06 crc kubenswrapper[4738]: E0307 07:06:06.816936 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3eec800aab5f2d2bb340840c98b491de29da4e5b97b3ebf9f27411d4def6423f\": container with ID starting with 3eec800aab5f2d2bb340840c98b491de29da4e5b97b3ebf9f27411d4def6423f not found: ID does not exist" containerID="3eec800aab5f2d2bb340840c98b491de29da4e5b97b3ebf9f27411d4def6423f" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.816976 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3eec800aab5f2d2bb340840c98b491de29da4e5b97b3ebf9f27411d4def6423f"} err="failed to get container status \"3eec800aab5f2d2bb340840c98b491de29da4e5b97b3ebf9f27411d4def6423f\": rpc error: code = NotFound desc = could not find container \"3eec800aab5f2d2bb340840c98b491de29da4e5b97b3ebf9f27411d4def6423f\": container with ID starting with 3eec800aab5f2d2bb340840c98b491de29da4e5b97b3ebf9f27411d4def6423f not found: ID does not exist" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.836950 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.837006 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.837056 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.837132 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.837145 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.837237 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.837327 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.837349 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.837426 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.837592 4738 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.837606 4738 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.837616 4738 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.837624 4738 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.847977 4738 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:06:06 crc kubenswrapper[4738]: I0307 07:06:06.938794 4738 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:07 crc kubenswrapper[4738]: I0307 07:06:07.843948 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 07 07:06:07 crc kubenswrapper[4738]: I0307 07:06:07.930465 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 07 07:06:08 crc kubenswrapper[4738]: I0307 07:06:08.142633 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 07 07:06:08 crc kubenswrapper[4738]: I0307 07:06:08.395331 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 07 07:06:08 crc kubenswrapper[4738]: I0307 07:06:08.395759 4738 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 07 07:06:08 crc kubenswrapper[4738]: I0307 07:06:08.406353 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 07 07:06:08 crc kubenswrapper[4738]: I0307 07:06:08.406407 4738 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
mirrorPodUID="341742f3-e41d-4d37-9f6e-8d5ccab6687c" Mar 07 07:06:08 crc kubenswrapper[4738]: I0307 07:06:08.410368 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 07 07:06:08 crc kubenswrapper[4738]: I0307 07:06:08.410416 4738 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="341742f3-e41d-4d37-9f6e-8d5ccab6687c" Mar 07 07:06:13 crc kubenswrapper[4738]: I0307 07:06:13.971718 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79875d7946-wtcdx"] Mar 07 07:06:13 crc kubenswrapper[4738]: I0307 07:06:13.972344 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" podUID="f5b28281-00ad-468a-8a6c-40c5930bc52f" containerName="controller-manager" containerID="cri-o://fd7759967742847d562ec93d52bb4bb05774216f1ba865eea2ab8c363234ccdb" gracePeriod=30 Mar 07 07:06:13 crc kubenswrapper[4738]: I0307 07:06:13.985642 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm"] Mar 07 07:06:13 crc kubenswrapper[4738]: I0307 07:06:13.985881 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" podUID="59183d23-c5a5-49cb-85a5-92bde3da9e12" containerName="route-controller-manager" containerID="cri-o://9ae070384c338536c4463069cfb3642007c943d45cc3e5036cddfdeaa2c77964" gracePeriod=30 Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.564119 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.571808 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.651536 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-proxy-ca-bundles\") pod \"f5b28281-00ad-468a-8a6c-40c5930bc52f\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.651601 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59183d23-c5a5-49cb-85a5-92bde3da9e12-client-ca\") pod \"59183d23-c5a5-49cb-85a5-92bde3da9e12\" (UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.651648 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b28281-00ad-468a-8a6c-40c5930bc52f-serving-cert\") pod \"f5b28281-00ad-468a-8a6c-40c5930bc52f\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.651684 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-config\") pod \"f5b28281-00ad-468a-8a6c-40c5930bc52f\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.651726 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59183d23-c5a5-49cb-85a5-92bde3da9e12-config\") pod \"59183d23-c5a5-49cb-85a5-92bde3da9e12\" 
(UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.651751 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-client-ca\") pod \"f5b28281-00ad-468a-8a6c-40c5930bc52f\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.651785 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59183d23-c5a5-49cb-85a5-92bde3da9e12-serving-cert\") pod \"59183d23-c5a5-49cb-85a5-92bde3da9e12\" (UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.651820 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmxfk\" (UniqueName: \"kubernetes.io/projected/f5b28281-00ad-468a-8a6c-40c5930bc52f-kube-api-access-pmxfk\") pod \"f5b28281-00ad-468a-8a6c-40c5930bc52f\" (UID: \"f5b28281-00ad-468a-8a6c-40c5930bc52f\") " Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.651845 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8kn9\" (UniqueName: \"kubernetes.io/projected/59183d23-c5a5-49cb-85a5-92bde3da9e12-kube-api-access-q8kn9\") pod \"59183d23-c5a5-49cb-85a5-92bde3da9e12\" (UID: \"59183d23-c5a5-49cb-85a5-92bde3da9e12\") " Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.652806 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-client-ca" (OuterVolumeSpecName: "client-ca") pod "f5b28281-00ad-468a-8a6c-40c5930bc52f" (UID: "f5b28281-00ad-468a-8a6c-40c5930bc52f"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.652855 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f5b28281-00ad-468a-8a6c-40c5930bc52f" (UID: "f5b28281-00ad-468a-8a6c-40c5930bc52f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.653038 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-config" (OuterVolumeSpecName: "config") pod "f5b28281-00ad-468a-8a6c-40c5930bc52f" (UID: "f5b28281-00ad-468a-8a6c-40c5930bc52f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.653067 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59183d23-c5a5-49cb-85a5-92bde3da9e12-client-ca" (OuterVolumeSpecName: "client-ca") pod "59183d23-c5a5-49cb-85a5-92bde3da9e12" (UID: "59183d23-c5a5-49cb-85a5-92bde3da9e12"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.653219 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59183d23-c5a5-49cb-85a5-92bde3da9e12-config" (OuterVolumeSpecName: "config") pod "59183d23-c5a5-49cb-85a5-92bde3da9e12" (UID: "59183d23-c5a5-49cb-85a5-92bde3da9e12"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.657185 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59183d23-c5a5-49cb-85a5-92bde3da9e12-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "59183d23-c5a5-49cb-85a5-92bde3da9e12" (UID: "59183d23-c5a5-49cb-85a5-92bde3da9e12"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.658332 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59183d23-c5a5-49cb-85a5-92bde3da9e12-kube-api-access-q8kn9" (OuterVolumeSpecName: "kube-api-access-q8kn9") pod "59183d23-c5a5-49cb-85a5-92bde3da9e12" (UID: "59183d23-c5a5-49cb-85a5-92bde3da9e12"). InnerVolumeSpecName "kube-api-access-q8kn9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.658802 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5b28281-00ad-468a-8a6c-40c5930bc52f-kube-api-access-pmxfk" (OuterVolumeSpecName: "kube-api-access-pmxfk") pod "f5b28281-00ad-468a-8a6c-40c5930bc52f" (UID: "f5b28281-00ad-468a-8a6c-40c5930bc52f"). InnerVolumeSpecName "kube-api-access-pmxfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.658858 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5b28281-00ad-468a-8a6c-40c5930bc52f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f5b28281-00ad-468a-8a6c-40c5930bc52f" (UID: "f5b28281-00ad-468a-8a6c-40c5930bc52f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.753893 4738 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.753926 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/59183d23-c5a5-49cb-85a5-92bde3da9e12-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.753936 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f5b28281-00ad-468a-8a6c-40c5930bc52f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.753946 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.753955 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/59183d23-c5a5-49cb-85a5-92bde3da9e12-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.753966 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f5b28281-00ad-468a-8a6c-40c5930bc52f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.753975 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/59183d23-c5a5-49cb-85a5-92bde3da9e12-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.753983 4738 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-pmxfk\" (UniqueName: \"kubernetes.io/projected/f5b28281-00ad-468a-8a6c-40c5930bc52f-kube-api-access-pmxfk\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.753993 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8kn9\" (UniqueName: \"kubernetes.io/projected/59183d23-c5a5-49cb-85a5-92bde3da9e12-kube-api-access-q8kn9\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.849923 4738 generic.go:334] "Generic (PLEG): container finished" podID="f5b28281-00ad-468a-8a6c-40c5930bc52f" containerID="fd7759967742847d562ec93d52bb4bb05774216f1ba865eea2ab8c363234ccdb" exitCode=0 Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.849982 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" event={"ID":"f5b28281-00ad-468a-8a6c-40c5930bc52f","Type":"ContainerDied","Data":"fd7759967742847d562ec93d52bb4bb05774216f1ba865eea2ab8c363234ccdb"} Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.850018 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" event={"ID":"f5b28281-00ad-468a-8a6c-40c5930bc52f","Type":"ContainerDied","Data":"1f94d68531b8417f1e838880b8b5ca997f9033decc81168ac8419ed0f9fff431"} Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.850040 4738 scope.go:117] "RemoveContainer" containerID="fd7759967742847d562ec93d52bb4bb05774216f1ba865eea2ab8c363234ccdb" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.850138 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79875d7946-wtcdx" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.854780 4738 generic.go:334] "Generic (PLEG): container finished" podID="59183d23-c5a5-49cb-85a5-92bde3da9e12" containerID="9ae070384c338536c4463069cfb3642007c943d45cc3e5036cddfdeaa2c77964" exitCode=0 Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.854830 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.854876 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" event={"ID":"59183d23-c5a5-49cb-85a5-92bde3da9e12","Type":"ContainerDied","Data":"9ae070384c338536c4463069cfb3642007c943d45cc3e5036cddfdeaa2c77964"} Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.854964 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm" event={"ID":"59183d23-c5a5-49cb-85a5-92bde3da9e12","Type":"ContainerDied","Data":"1fbb72305780bea4782ca43e3b365b838c3f05e044474307dedec8adb4d30b52"} Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.873141 4738 scope.go:117] "RemoveContainer" containerID="fd7759967742847d562ec93d52bb4bb05774216f1ba865eea2ab8c363234ccdb" Mar 07 07:06:14 crc kubenswrapper[4738]: E0307 07:06:14.873580 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd7759967742847d562ec93d52bb4bb05774216f1ba865eea2ab8c363234ccdb\": container with ID starting with fd7759967742847d562ec93d52bb4bb05774216f1ba865eea2ab8c363234ccdb not found: ID does not exist" containerID="fd7759967742847d562ec93d52bb4bb05774216f1ba865eea2ab8c363234ccdb" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.873619 4738 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd7759967742847d562ec93d52bb4bb05774216f1ba865eea2ab8c363234ccdb"} err="failed to get container status \"fd7759967742847d562ec93d52bb4bb05774216f1ba865eea2ab8c363234ccdb\": rpc error: code = NotFound desc = could not find container \"fd7759967742847d562ec93d52bb4bb05774216f1ba865eea2ab8c363234ccdb\": container with ID starting with fd7759967742847d562ec93d52bb4bb05774216f1ba865eea2ab8c363234ccdb not found: ID does not exist" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.873644 4738 scope.go:117] "RemoveContainer" containerID="9ae070384c338536c4463069cfb3642007c943d45cc3e5036cddfdeaa2c77964" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.882300 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79875d7946-wtcdx"] Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.886462 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-79875d7946-wtcdx"] Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.891579 4738 scope.go:117] "RemoveContainer" containerID="9ae070384c338536c4463069cfb3642007c943d45cc3e5036cddfdeaa2c77964" Mar 07 07:06:14 crc kubenswrapper[4738]: E0307 07:06:14.892154 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ae070384c338536c4463069cfb3642007c943d45cc3e5036cddfdeaa2c77964\": container with ID starting with 9ae070384c338536c4463069cfb3642007c943d45cc3e5036cddfdeaa2c77964 not found: ID does not exist" containerID="9ae070384c338536c4463069cfb3642007c943d45cc3e5036cddfdeaa2c77964" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.892210 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ae070384c338536c4463069cfb3642007c943d45cc3e5036cddfdeaa2c77964"} err="failed to get container status 
\"9ae070384c338536c4463069cfb3642007c943d45cc3e5036cddfdeaa2c77964\": rpc error: code = NotFound desc = could not find container \"9ae070384c338536c4463069cfb3642007c943d45cc3e5036cddfdeaa2c77964\": container with ID starting with 9ae070384c338536c4463069cfb3642007c943d45cc3e5036cddfdeaa2c77964 not found: ID does not exist" Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.895543 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm"] Mar 07 07:06:14 crc kubenswrapper[4738]: I0307 07:06:14.898346 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-94986fdd4-7ngvm"] Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.716583 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr"] Mar 07 07:06:15 crc kubenswrapper[4738]: E0307 07:06:15.717181 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4" containerName="oc" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.717197 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4" containerName="oc" Mar 07 07:06:15 crc kubenswrapper[4738]: E0307 07:06:15.717235 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5b28281-00ad-468a-8a6c-40c5930bc52f" containerName="controller-manager" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.717242 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5b28281-00ad-468a-8a6c-40c5930bc52f" containerName="controller-manager" Mar 07 07:06:15 crc kubenswrapper[4738]: E0307 07:06:15.717256 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59183d23-c5a5-49cb-85a5-92bde3da9e12" containerName="route-controller-manager" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.717263 4738 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="59183d23-c5a5-49cb-85a5-92bde3da9e12" containerName="route-controller-manager" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.717361 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4" containerName="oc" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.717371 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="59183d23-c5a5-49cb-85a5-92bde3da9e12" containerName="route-controller-manager" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.717381 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5b28281-00ad-468a-8a6c-40c5930bc52f" containerName="controller-manager" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.717821 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.720038 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.721299 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.721648 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.722088 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.722108 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.722825 4738 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v"] Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.723862 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.725752 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.726860 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v"] Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.728257 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.732787 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.734825 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.747530 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.747970 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.748950 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.751944 4738 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.756259 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr"] Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.767907 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g98jc\" (UniqueName: \"kubernetes.io/projected/8bcdcae6-ffd9-461b-a0e1-272c93108787-kube-api-access-g98jc\") pod \"controller-manager-6dbd4c5b74-72gwr\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.767962 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-proxy-ca-bundles\") pod \"controller-manager-6dbd4c5b74-72gwr\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.767998 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-config\") pod \"controller-manager-6dbd4c5b74-72gwr\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.768030 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bcdcae6-ffd9-461b-a0e1-272c93108787-serving-cert\") pod \"controller-manager-6dbd4c5b74-72gwr\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " 
pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.768054 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adcc31ea-465b-406d-be10-3e44ad4c3031-config\") pod \"route-controller-manager-778677f976-7cv6v\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.768093 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-client-ca\") pod \"controller-manager-6dbd4c5b74-72gwr\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.768119 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adcc31ea-465b-406d-be10-3e44ad4c3031-client-ca\") pod \"route-controller-manager-778677f976-7cv6v\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.768144 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adcc31ea-465b-406d-be10-3e44ad4c3031-serving-cert\") pod \"route-controller-manager-778677f976-7cv6v\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.768189 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-btfzg\" (UniqueName: \"kubernetes.io/projected/adcc31ea-465b-406d-be10-3e44ad4c3031-kube-api-access-btfzg\") pod \"route-controller-manager-778677f976-7cv6v\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.870020 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g98jc\" (UniqueName: \"kubernetes.io/projected/8bcdcae6-ffd9-461b-a0e1-272c93108787-kube-api-access-g98jc\") pod \"controller-manager-6dbd4c5b74-72gwr\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.870098 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-proxy-ca-bundles\") pod \"controller-manager-6dbd4c5b74-72gwr\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.870138 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-config\") pod \"controller-manager-6dbd4c5b74-72gwr\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.870208 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bcdcae6-ffd9-461b-a0e1-272c93108787-serving-cert\") pod \"controller-manager-6dbd4c5b74-72gwr\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " 
pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.870242 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adcc31ea-465b-406d-be10-3e44ad4c3031-config\") pod \"route-controller-manager-778677f976-7cv6v\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.870296 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-client-ca\") pod \"controller-manager-6dbd4c5b74-72gwr\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.870334 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adcc31ea-465b-406d-be10-3e44ad4c3031-client-ca\") pod \"route-controller-manager-778677f976-7cv6v\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.870368 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adcc31ea-465b-406d-be10-3e44ad4c3031-serving-cert\") pod \"route-controller-manager-778677f976-7cv6v\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.870399 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btfzg\" (UniqueName: 
\"kubernetes.io/projected/adcc31ea-465b-406d-be10-3e44ad4c3031-kube-api-access-btfzg\") pod \"route-controller-manager-778677f976-7cv6v\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.871851 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-config\") pod \"controller-manager-6dbd4c5b74-72gwr\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.872447 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-proxy-ca-bundles\") pod \"controller-manager-6dbd4c5b74-72gwr\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.872502 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-client-ca\") pod \"controller-manager-6dbd4c5b74-72gwr\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.872912 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adcc31ea-465b-406d-be10-3e44ad4c3031-config\") pod \"route-controller-manager-778677f976-7cv6v\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.873212 4738 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adcc31ea-465b-406d-be10-3e44ad4c3031-client-ca\") pod \"route-controller-manager-778677f976-7cv6v\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.880420 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bcdcae6-ffd9-461b-a0e1-272c93108787-serving-cert\") pod \"controller-manager-6dbd4c5b74-72gwr\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.889509 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g98jc\" (UniqueName: \"kubernetes.io/projected/8bcdcae6-ffd9-461b-a0e1-272c93108787-kube-api-access-g98jc\") pod \"controller-manager-6dbd4c5b74-72gwr\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.890442 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adcc31ea-465b-406d-be10-3e44ad4c3031-serving-cert\") pod \"route-controller-manager-778677f976-7cv6v\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:15 crc kubenswrapper[4738]: I0307 07:06:15.899765 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btfzg\" (UniqueName: \"kubernetes.io/projected/adcc31ea-465b-406d-be10-3e44ad4c3031-kube-api-access-btfzg\") pod \"route-controller-manager-778677f976-7cv6v\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " 
pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:16 crc kubenswrapper[4738]: I0307 07:06:16.046392 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:16 crc kubenswrapper[4738]: I0307 07:06:16.063174 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:16 crc kubenswrapper[4738]: I0307 07:06:16.283642 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr"] Mar 07 07:06:16 crc kubenswrapper[4738]: I0307 07:06:16.396740 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59183d23-c5a5-49cb-85a5-92bde3da9e12" path="/var/lib/kubelet/pods/59183d23-c5a5-49cb-85a5-92bde3da9e12/volumes" Mar 07 07:06:16 crc kubenswrapper[4738]: I0307 07:06:16.398957 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5b28281-00ad-468a-8a6c-40c5930bc52f" path="/var/lib/kubelet/pods/f5b28281-00ad-468a-8a6c-40c5930bc52f/volumes" Mar 07 07:06:16 crc kubenswrapper[4738]: I0307 07:06:16.569572 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v"] Mar 07 07:06:16 crc kubenswrapper[4738]: I0307 07:06:16.872962 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" event={"ID":"adcc31ea-465b-406d-be10-3e44ad4c3031","Type":"ContainerStarted","Data":"92a76e68271f132f1c5f7822b28270b5b08995fc7277aa3694c6ff4c2ccf5d5f"} Mar 07 07:06:16 crc kubenswrapper[4738]: I0307 07:06:16.873297 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" 
event={"ID":"adcc31ea-465b-406d-be10-3e44ad4c3031","Type":"ContainerStarted","Data":"3132759f9dc992a67416da6cba6d2b73e682c3b60fb0487572f6ea4b8f038b77"} Mar 07 07:06:16 crc kubenswrapper[4738]: I0307 07:06:16.873316 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:16 crc kubenswrapper[4738]: I0307 07:06:16.878058 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" event={"ID":"8bcdcae6-ffd9-461b-a0e1-272c93108787","Type":"ContainerStarted","Data":"8578c250b4bb96ac0338c14d75d881031f5bc1c47f28f0fcc11bc1b583a820c6"} Mar 07 07:06:16 crc kubenswrapper[4738]: I0307 07:06:16.878122 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" event={"ID":"8bcdcae6-ffd9-461b-a0e1-272c93108787","Type":"ContainerStarted","Data":"1f9e3fac3777ec40082dec6bb0d219326b05d2e8b84e7e7357400de85c4c4011"} Mar 07 07:06:16 crc kubenswrapper[4738]: I0307 07:06:16.878310 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:16 crc kubenswrapper[4738]: I0307 07:06:16.884271 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:16 crc kubenswrapper[4738]: I0307 07:06:16.891771 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" podStartSLOduration=2.891751552 podStartE2EDuration="2.891751552s" podCreationTimestamp="2026-03-07 07:06:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:06:16.888792352 +0000 UTC m=+395.353779683" 
watchObservedRunningTime="2026-03-07 07:06:16.891751552 +0000 UTC m=+395.356738883" Mar 07 07:06:17 crc kubenswrapper[4738]: I0307 07:06:17.512926 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:17 crc kubenswrapper[4738]: I0307 07:06:17.536708 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" podStartSLOduration=4.536676625 podStartE2EDuration="4.536676625s" podCreationTimestamp="2026-03-07 07:06:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:06:16.917501785 +0000 UTC m=+395.382489106" watchObservedRunningTime="2026-03-07 07:06:17.536676625 +0000 UTC m=+396.001663936" Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.119000 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v"] Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.120326 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" podUID="adcc31ea-465b-406d-be10-3e44ad4c3031" containerName="route-controller-manager" containerID="cri-o://92a76e68271f132f1c5f7822b28270b5b08995fc7277aa3694c6ff4c2ccf5d5f" gracePeriod=30 Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.602701 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.670464 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adcc31ea-465b-406d-be10-3e44ad4c3031-config\") pod \"adcc31ea-465b-406d-be10-3e44ad4c3031\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.670712 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adcc31ea-465b-406d-be10-3e44ad4c3031-client-ca\") pod \"adcc31ea-465b-406d-be10-3e44ad4c3031\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.670863 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adcc31ea-465b-406d-be10-3e44ad4c3031-serving-cert\") pod \"adcc31ea-465b-406d-be10-3e44ad4c3031\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.671042 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btfzg\" (UniqueName: \"kubernetes.io/projected/adcc31ea-465b-406d-be10-3e44ad4c3031-kube-api-access-btfzg\") pod \"adcc31ea-465b-406d-be10-3e44ad4c3031\" (UID: \"adcc31ea-465b-406d-be10-3e44ad4c3031\") " Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.671278 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adcc31ea-465b-406d-be10-3e44ad4c3031-config" (OuterVolumeSpecName: "config") pod "adcc31ea-465b-406d-be10-3e44ad4c3031" (UID: "adcc31ea-465b-406d-be10-3e44ad4c3031"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.671422 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/adcc31ea-465b-406d-be10-3e44ad4c3031-client-ca" (OuterVolumeSpecName: "client-ca") pod "adcc31ea-465b-406d-be10-3e44ad4c3031" (UID: "adcc31ea-465b-406d-be10-3e44ad4c3031"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.671588 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/adcc31ea-465b-406d-be10-3e44ad4c3031-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.677614 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/adcc31ea-465b-406d-be10-3e44ad4c3031-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "adcc31ea-465b-406d-be10-3e44ad4c3031" (UID: "adcc31ea-465b-406d-be10-3e44ad4c3031"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.685307 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adcc31ea-465b-406d-be10-3e44ad4c3031-kube-api-access-btfzg" (OuterVolumeSpecName: "kube-api-access-btfzg") pod "adcc31ea-465b-406d-be10-3e44ad4c3031" (UID: "adcc31ea-465b-406d-be10-3e44ad4c3031"). InnerVolumeSpecName "kube-api-access-btfzg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.773459 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btfzg\" (UniqueName: \"kubernetes.io/projected/adcc31ea-465b-406d-be10-3e44ad4c3031-kube-api-access-btfzg\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.773505 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/adcc31ea-465b-406d-be10-3e44ad4c3031-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.773520 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/adcc31ea-465b-406d-be10-3e44ad4c3031-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.987065 4738 generic.go:334] "Generic (PLEG): container finished" podID="adcc31ea-465b-406d-be10-3e44ad4c3031" containerID="92a76e68271f132f1c5f7822b28270b5b08995fc7277aa3694c6ff4c2ccf5d5f" exitCode=0 Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.987133 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" event={"ID":"adcc31ea-465b-406d-be10-3e44ad4c3031","Type":"ContainerDied","Data":"92a76e68271f132f1c5f7822b28270b5b08995fc7277aa3694c6ff4c2ccf5d5f"} Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.987216 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" event={"ID":"adcc31ea-465b-406d-be10-3e44ad4c3031","Type":"ContainerDied","Data":"3132759f9dc992a67416da6cba6d2b73e682c3b60fb0487572f6ea4b8f038b77"} Mar 07 07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.987256 4738 scope.go:117] "RemoveContainer" containerID="92a76e68271f132f1c5f7822b28270b5b08995fc7277aa3694c6ff4c2ccf5d5f" Mar 07 
07:06:34 crc kubenswrapper[4738]: I0307 07:06:34.987471 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.005960 4738 scope.go:117] "RemoveContainer" containerID="92a76e68271f132f1c5f7822b28270b5b08995fc7277aa3694c6ff4c2ccf5d5f" Mar 07 07:06:35 crc kubenswrapper[4738]: E0307 07:06:35.006593 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92a76e68271f132f1c5f7822b28270b5b08995fc7277aa3694c6ff4c2ccf5d5f\": container with ID starting with 92a76e68271f132f1c5f7822b28270b5b08995fc7277aa3694c6ff4c2ccf5d5f not found: ID does not exist" containerID="92a76e68271f132f1c5f7822b28270b5b08995fc7277aa3694c6ff4c2ccf5d5f" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.006670 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92a76e68271f132f1c5f7822b28270b5b08995fc7277aa3694c6ff4c2ccf5d5f"} err="failed to get container status \"92a76e68271f132f1c5f7822b28270b5b08995fc7277aa3694c6ff4c2ccf5d5f\": rpc error: code = NotFound desc = could not find container \"92a76e68271f132f1c5f7822b28270b5b08995fc7277aa3694c6ff4c2ccf5d5f\": container with ID starting with 92a76e68271f132f1c5f7822b28270b5b08995fc7277aa3694c6ff4c2ccf5d5f not found: ID does not exist" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.029023 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v"] Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.043023 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-778677f976-7cv6v"] Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.728573 4738 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75"] Mar 07 07:06:35 crc kubenswrapper[4738]: E0307 07:06:35.728909 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adcc31ea-465b-406d-be10-3e44ad4c3031" containerName="route-controller-manager" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.728924 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="adcc31ea-465b-406d-be10-3e44ad4c3031" containerName="route-controller-manager" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.729043 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="adcc31ea-465b-406d-be10-3e44ad4c3031" containerName="route-controller-manager" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.729648 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.737149 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.737229 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.737238 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.737504 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.737544 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.738146 4738 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.748786 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75"] Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.787591 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c529c016-d371-48e8-83b5-f6b05a9bc589-config\") pod \"route-controller-manager-94986fdd4-jhw75\" (UID: \"c529c016-d371-48e8-83b5-f6b05a9bc589\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.787883 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c529c016-d371-48e8-83b5-f6b05a9bc589-client-ca\") pod \"route-controller-manager-94986fdd4-jhw75\" (UID: \"c529c016-d371-48e8-83b5-f6b05a9bc589\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.787970 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c529c016-d371-48e8-83b5-f6b05a9bc589-serving-cert\") pod \"route-controller-manager-94986fdd4-jhw75\" (UID: \"c529c016-d371-48e8-83b5-f6b05a9bc589\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.788021 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k242x\" (UniqueName: \"kubernetes.io/projected/c529c016-d371-48e8-83b5-f6b05a9bc589-kube-api-access-k242x\") pod \"route-controller-manager-94986fdd4-jhw75\" (UID: 
\"c529c016-d371-48e8-83b5-f6b05a9bc589\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.889852 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c529c016-d371-48e8-83b5-f6b05a9bc589-config\") pod \"route-controller-manager-94986fdd4-jhw75\" (UID: \"c529c016-d371-48e8-83b5-f6b05a9bc589\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.889991 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c529c016-d371-48e8-83b5-f6b05a9bc589-client-ca\") pod \"route-controller-manager-94986fdd4-jhw75\" (UID: \"c529c016-d371-48e8-83b5-f6b05a9bc589\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.890037 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c529c016-d371-48e8-83b5-f6b05a9bc589-serving-cert\") pod \"route-controller-manager-94986fdd4-jhw75\" (UID: \"c529c016-d371-48e8-83b5-f6b05a9bc589\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.890060 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k242x\" (UniqueName: \"kubernetes.io/projected/c529c016-d371-48e8-83b5-f6b05a9bc589-kube-api-access-k242x\") pod \"route-controller-manager-94986fdd4-jhw75\" (UID: \"c529c016-d371-48e8-83b5-f6b05a9bc589\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.891295 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/c529c016-d371-48e8-83b5-f6b05a9bc589-config\") pod \"route-controller-manager-94986fdd4-jhw75\" (UID: \"c529c016-d371-48e8-83b5-f6b05a9bc589\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.892467 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c529c016-d371-48e8-83b5-f6b05a9bc589-client-ca\") pod \"route-controller-manager-94986fdd4-jhw75\" (UID: \"c529c016-d371-48e8-83b5-f6b05a9bc589\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.895369 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c529c016-d371-48e8-83b5-f6b05a9bc589-serving-cert\") pod \"route-controller-manager-94986fdd4-jhw75\" (UID: \"c529c016-d371-48e8-83b5-f6b05a9bc589\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:35 crc kubenswrapper[4738]: I0307 07:06:35.907621 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k242x\" (UniqueName: \"kubernetes.io/projected/c529c016-d371-48e8-83b5-f6b05a9bc589-kube-api-access-k242x\") pod \"route-controller-manager-94986fdd4-jhw75\" (UID: \"c529c016-d371-48e8-83b5-f6b05a9bc589\") " pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:36 crc kubenswrapper[4738]: I0307 07:06:36.051361 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:36 crc kubenswrapper[4738]: I0307 07:06:36.391728 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adcc31ea-465b-406d-be10-3e44ad4c3031" path="/var/lib/kubelet/pods/adcc31ea-465b-406d-be10-3e44ad4c3031/volumes" Mar 07 07:06:36 crc kubenswrapper[4738]: I0307 07:06:36.479178 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75"] Mar 07 07:06:36 crc kubenswrapper[4738]: W0307 07:06:36.485727 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc529c016_d371_48e8_83b5_f6b05a9bc589.slice/crio-d903eb214de1bb7ebda7d622de2cdd8825f69309e2d8dea56a18190b01efb0c6 WatchSource:0}: Error finding container d903eb214de1bb7ebda7d622de2cdd8825f69309e2d8dea56a18190b01efb0c6: Status 404 returned error can't find the container with id d903eb214de1bb7ebda7d622de2cdd8825f69309e2d8dea56a18190b01efb0c6 Mar 07 07:06:37 crc kubenswrapper[4738]: I0307 07:06:37.002391 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" event={"ID":"c529c016-d371-48e8-83b5-f6b05a9bc589","Type":"ContainerStarted","Data":"76ce7f22cd28331fedb2c2fad35520e7aea45aec29f3fa0e54f2105205926c32"} Mar 07 07:06:37 crc kubenswrapper[4738]: I0307 07:06:37.003534 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:37 crc kubenswrapper[4738]: I0307 07:06:37.003556 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" 
event={"ID":"c529c016-d371-48e8-83b5-f6b05a9bc589","Type":"ContainerStarted","Data":"d903eb214de1bb7ebda7d622de2cdd8825f69309e2d8dea56a18190b01efb0c6"} Mar 07 07:06:37 crc kubenswrapper[4738]: I0307 07:06:37.028401 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" podStartSLOduration=3.028357941 podStartE2EDuration="3.028357941s" podCreationTimestamp="2026-03-07 07:06:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:06:37.021509747 +0000 UTC m=+415.486497138" watchObservedRunningTime="2026-03-07 07:06:37.028357941 +0000 UTC m=+415.493345302" Mar 07 07:06:37 crc kubenswrapper[4738]: I0307 07:06:37.270240 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-94986fdd4-jhw75" Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.679047 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t6bh9"] Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.680673 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-t6bh9" podUID="f34dc9c2-f90f-473e-8dca-a3df3f70e02f" containerName="registry-server" containerID="cri-o://d5b52c7a1fae65f0a2915cfbf369956f042ec4e9468ba591e2b6e6d7fb92015e" gracePeriod=30 Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.689215 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-587ww"] Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.689516 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-587ww" podUID="b82c0da7-caec-462d-85e9-f3c45cc042b5" containerName="registry-server" 
containerID="cri-o://897ac86f59ba3751505c36a9676f6c3a0f804e530e6fe99db330195d77a63ea8" gracePeriod=30 Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.711259 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-67thp"] Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.711762 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-67thp" podUID="6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc" containerName="marketplace-operator" containerID="cri-o://0630bea3371a28de009e9ab4e1f64a0ceb86668c7a022aa74187c40010202ef7" gracePeriod=30 Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.718203 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnt9k"] Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.718571 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rnt9k" podUID="4ccab57d-f355-494a-adae-5a1dba9c360a" containerName="registry-server" containerID="cri-o://876bf0bfd99e9f59e4ce96d97e0a1b03b380e51fb8e6acc67097ce1612d9d83c" gracePeriod=30 Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.727602 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gfmjp"] Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.728011 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gfmjp" podUID="2731d6f8-adb2-4068-a3c9-162cfcb7de07" containerName="registry-server" containerID="cri-o://783cd768a6c50c8f4785a3e7d7a800f9fda331155faab9f13072762afb0b11c6" gracePeriod=30 Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.732131 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qzsqg"] Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.733765 
4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.741411 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qzsqg"] Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.785063 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16598a41-f8be-4b82-84c1-b718c0b24b8e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qzsqg\" (UID: \"16598a41-f8be-4b82-84c1-b718c0b24b8e\") " pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.785160 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdg77\" (UniqueName: \"kubernetes.io/projected/16598a41-f8be-4b82-84c1-b718c0b24b8e-kube-api-access-fdg77\") pod \"marketplace-operator-79b997595-qzsqg\" (UID: \"16598a41-f8be-4b82-84c1-b718c0b24b8e\") " pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.785197 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16598a41-f8be-4b82-84c1-b718c0b24b8e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qzsqg\" (UID: \"16598a41-f8be-4b82-84c1-b718c0b24b8e\") " pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.890169 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16598a41-f8be-4b82-84c1-b718c0b24b8e-marketplace-trusted-ca\") pod 
\"marketplace-operator-79b997595-qzsqg\" (UID: \"16598a41-f8be-4b82-84c1-b718c0b24b8e\") " pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.890318 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdg77\" (UniqueName: \"kubernetes.io/projected/16598a41-f8be-4b82-84c1-b718c0b24b8e-kube-api-access-fdg77\") pod \"marketplace-operator-79b997595-qzsqg\" (UID: \"16598a41-f8be-4b82-84c1-b718c0b24b8e\") " pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.890350 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16598a41-f8be-4b82-84c1-b718c0b24b8e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qzsqg\" (UID: \"16598a41-f8be-4b82-84c1-b718c0b24b8e\") " pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.892982 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/16598a41-f8be-4b82-84c1-b718c0b24b8e-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qzsqg\" (UID: \"16598a41-f8be-4b82-84c1-b718c0b24b8e\") " pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.904098 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/16598a41-f8be-4b82-84c1-b718c0b24b8e-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qzsqg\" (UID: \"16598a41-f8be-4b82-84c1-b718c0b24b8e\") " pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" Mar 07 07:06:49 crc kubenswrapper[4738]: I0307 07:06:49.952340 4738 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fdg77\" (UniqueName: \"kubernetes.io/projected/16598a41-f8be-4b82-84c1-b718c0b24b8e-kube-api-access-fdg77\") pod \"marketplace-operator-79b997595-qzsqg\" (UID: \"16598a41-f8be-4b82-84c1-b718c0b24b8e\") " pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.095603 4738 generic.go:334] "Generic (PLEG): container finished" podID="b82c0da7-caec-462d-85e9-f3c45cc042b5" containerID="897ac86f59ba3751505c36a9676f6c3a0f804e530e6fe99db330195d77a63ea8" exitCode=0 Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.095694 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-587ww" event={"ID":"b82c0da7-caec-462d-85e9-f3c45cc042b5","Type":"ContainerDied","Data":"897ac86f59ba3751505c36a9676f6c3a0f804e530e6fe99db330195d77a63ea8"} Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.101507 4738 generic.go:334] "Generic (PLEG): container finished" podID="4ccab57d-f355-494a-adae-5a1dba9c360a" containerID="876bf0bfd99e9f59e4ce96d97e0a1b03b380e51fb8e6acc67097ce1612d9d83c" exitCode=0 Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.101573 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnt9k" event={"ID":"4ccab57d-f355-494a-adae-5a1dba9c360a","Type":"ContainerDied","Data":"876bf0bfd99e9f59e4ce96d97e0a1b03b380e51fb8e6acc67097ce1612d9d83c"} Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.105547 4738 generic.go:334] "Generic (PLEG): container finished" podID="6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc" containerID="0630bea3371a28de009e9ab4e1f64a0ceb86668c7a022aa74187c40010202ef7" exitCode=0 Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.105628 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-67thp" 
event={"ID":"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc","Type":"ContainerDied","Data":"0630bea3371a28de009e9ab4e1f64a0ceb86668c7a022aa74187c40010202ef7"} Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.109491 4738 generic.go:334] "Generic (PLEG): container finished" podID="f34dc9c2-f90f-473e-8dca-a3df3f70e02f" containerID="d5b52c7a1fae65f0a2915cfbf369956f042ec4e9468ba591e2b6e6d7fb92015e" exitCode=0 Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.109615 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6bh9" event={"ID":"f34dc9c2-f90f-473e-8dca-a3df3f70e02f","Type":"ContainerDied","Data":"d5b52c7a1fae65f0a2915cfbf369956f042ec4e9468ba591e2b6e6d7fb92015e"} Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.125634 4738 generic.go:334] "Generic (PLEG): container finished" podID="2731d6f8-adb2-4068-a3c9-162cfcb7de07" containerID="783cd768a6c50c8f4785a3e7d7a800f9fda331155faab9f13072762afb0b11c6" exitCode=0 Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.125683 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfmjp" event={"ID":"2731d6f8-adb2-4068-a3c9-162cfcb7de07","Type":"ContainerDied","Data":"783cd768a6c50c8f4785a3e7d7a800f9fda331155faab9f13072762afb0b11c6"} Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.173579 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.196363 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-t6bh9" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.295266 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbhg8\" (UniqueName: \"kubernetes.io/projected/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-kube-api-access-zbhg8\") pod \"f34dc9c2-f90f-473e-8dca-a3df3f70e02f\" (UID: \"f34dc9c2-f90f-473e-8dca-a3df3f70e02f\") " Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.295379 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-catalog-content\") pod \"f34dc9c2-f90f-473e-8dca-a3df3f70e02f\" (UID: \"f34dc9c2-f90f-473e-8dca-a3df3f70e02f\") " Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.295518 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-utilities\") pod \"f34dc9c2-f90f-473e-8dca-a3df3f70e02f\" (UID: \"f34dc9c2-f90f-473e-8dca-a3df3f70e02f\") " Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.297099 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-utilities" (OuterVolumeSpecName: "utilities") pod "f34dc9c2-f90f-473e-8dca-a3df3f70e02f" (UID: "f34dc9c2-f90f-473e-8dca-a3df3f70e02f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.310211 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-kube-api-access-zbhg8" (OuterVolumeSpecName: "kube-api-access-zbhg8") pod "f34dc9c2-f90f-473e-8dca-a3df3f70e02f" (UID: "f34dc9c2-f90f-473e-8dca-a3df3f70e02f"). InnerVolumeSpecName "kube-api-access-zbhg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.360298 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f34dc9c2-f90f-473e-8dca-a3df3f70e02f" (UID: "f34dc9c2-f90f-473e-8dca-a3df3f70e02f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.403002 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.403076 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbhg8\" (UniqueName: \"kubernetes.io/projected/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-kube-api-access-zbhg8\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.403101 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f34dc9c2-f90f-473e-8dca-a3df3f70e02f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.540926 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-587ww" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.574446 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-67thp" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.600749 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnt9k" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.605055 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kwcq\" (UniqueName: \"kubernetes.io/projected/b82c0da7-caec-462d-85e9-f3c45cc042b5-kube-api-access-6kwcq\") pod \"b82c0da7-caec-462d-85e9-f3c45cc042b5\" (UID: \"b82c0da7-caec-462d-85e9-f3c45cc042b5\") " Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.605375 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b82c0da7-caec-462d-85e9-f3c45cc042b5-catalog-content\") pod \"b82c0da7-caec-462d-85e9-f3c45cc042b5\" (UID: \"b82c0da7-caec-462d-85e9-f3c45cc042b5\") " Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.605495 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b82c0da7-caec-462d-85e9-f3c45cc042b5-utilities\") pod \"b82c0da7-caec-462d-85e9-f3c45cc042b5\" (UID: \"b82c0da7-caec-462d-85e9-f3c45cc042b5\") " Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.606433 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b82c0da7-caec-462d-85e9-f3c45cc042b5-utilities" (OuterVolumeSpecName: "utilities") pod "b82c0da7-caec-462d-85e9-f3c45cc042b5" (UID: "b82c0da7-caec-462d-85e9-f3c45cc042b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.607481 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.610312 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b82c0da7-caec-462d-85e9-f3c45cc042b5-kube-api-access-6kwcq" (OuterVolumeSpecName: "kube-api-access-6kwcq") pod "b82c0da7-caec-462d-85e9-f3c45cc042b5" (UID: "b82c0da7-caec-462d-85e9-f3c45cc042b5"). InnerVolumeSpecName "kube-api-access-6kwcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.683910 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b82c0da7-caec-462d-85e9-f3c45cc042b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b82c0da7-caec-462d-85e9-f3c45cc042b5" (UID: "b82c0da7-caec-462d-85e9-f3c45cc042b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.706415 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-marketplace-operator-metrics\") pod \"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc\" (UID: \"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc\") " Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.706679 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-marketplace-trusted-ca\") pod \"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc\" (UID: \"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc\") " Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.706762 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8ph6\" (UniqueName: 
\"kubernetes.io/projected/2731d6f8-adb2-4068-a3c9-162cfcb7de07-kube-api-access-m8ph6\") pod \"2731d6f8-adb2-4068-a3c9-162cfcb7de07\" (UID: \"2731d6f8-adb2-4068-a3c9-162cfcb7de07\") " Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.706846 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2731d6f8-adb2-4068-a3c9-162cfcb7de07-utilities\") pod \"2731d6f8-adb2-4068-a3c9-162cfcb7de07\" (UID: \"2731d6f8-adb2-4068-a3c9-162cfcb7de07\") " Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.706921 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2731d6f8-adb2-4068-a3c9-162cfcb7de07-catalog-content\") pod \"2731d6f8-adb2-4068-a3c9-162cfcb7de07\" (UID: \"2731d6f8-adb2-4068-a3c9-162cfcb7de07\") " Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.707041 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ccab57d-f355-494a-adae-5a1dba9c360a-utilities\") pod \"4ccab57d-f355-494a-adae-5a1dba9c360a\" (UID: \"4ccab57d-f355-494a-adae-5a1dba9c360a\") " Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.707299 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ccab57d-f355-494a-adae-5a1dba9c360a-catalog-content\") pod \"4ccab57d-f355-494a-adae-5a1dba9c360a\" (UID: \"4ccab57d-f355-494a-adae-5a1dba9c360a\") " Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.707399 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6b76v\" (UniqueName: \"kubernetes.io/projected/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-kube-api-access-6b76v\") pod \"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc\" (UID: \"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc\") " Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 
07:06:50.707510 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2w9b\" (UniqueName: \"kubernetes.io/projected/4ccab57d-f355-494a-adae-5a1dba9c360a-kube-api-access-x2w9b\") pod \"4ccab57d-f355-494a-adae-5a1dba9c360a\" (UID: \"4ccab57d-f355-494a-adae-5a1dba9c360a\") " Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.707363 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc" (UID: "6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.707686 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2731d6f8-adb2-4068-a3c9-162cfcb7de07-utilities" (OuterVolumeSpecName: "utilities") pod "2731d6f8-adb2-4068-a3c9-162cfcb7de07" (UID: "2731d6f8-adb2-4068-a3c9-162cfcb7de07"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.707945 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ccab57d-f355-494a-adae-5a1dba9c360a-utilities" (OuterVolumeSpecName: "utilities") pod "4ccab57d-f355-494a-adae-5a1dba9c360a" (UID: "4ccab57d-f355-494a-adae-5a1dba9c360a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.709318 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b82c0da7-caec-462d-85e9-f3c45cc042b5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.709563 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ccab57d-f355-494a-adae-5a1dba9c360a-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.710009 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b82c0da7-caec-462d-85e9-f3c45cc042b5-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.710093 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kwcq\" (UniqueName: \"kubernetes.io/projected/b82c0da7-caec-462d-85e9-f3c45cc042b5-kube-api-access-6kwcq\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.710185 4738 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.710262 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2731d6f8-adb2-4068-a3c9-162cfcb7de07-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.716028 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qzsqg"] Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.716119 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4ccab57d-f355-494a-adae-5a1dba9c360a-kube-api-access-x2w9b" (OuterVolumeSpecName: "kube-api-access-x2w9b") pod "4ccab57d-f355-494a-adae-5a1dba9c360a" (UID: "4ccab57d-f355-494a-adae-5a1dba9c360a"). InnerVolumeSpecName "kube-api-access-x2w9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.716188 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2731d6f8-adb2-4068-a3c9-162cfcb7de07-kube-api-access-m8ph6" (OuterVolumeSpecName: "kube-api-access-m8ph6") pod "2731d6f8-adb2-4068-a3c9-162cfcb7de07" (UID: "2731d6f8-adb2-4068-a3c9-162cfcb7de07"). InnerVolumeSpecName "kube-api-access-m8ph6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.716435 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-kube-api-access-6b76v" (OuterVolumeSpecName: "kube-api-access-6b76v") pod "6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc" (UID: "6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc"). InnerVolumeSpecName "kube-api-access-6b76v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.716548 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc" (UID: "6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:06:50 crc kubenswrapper[4738]: W0307 07:06:50.718846 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16598a41_f8be_4b82_84c1_b718c0b24b8e.slice/crio-07838b1a187f7bc82e8861d629efebe907f20e9904122535ad623f5536cd7cd6 WatchSource:0}: Error finding container 07838b1a187f7bc82e8861d629efebe907f20e9904122535ad623f5536cd7cd6: Status 404 returned error can't find the container with id 07838b1a187f7bc82e8861d629efebe907f20e9904122535ad623f5536cd7cd6 Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.746411 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ccab57d-f355-494a-adae-5a1dba9c360a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ccab57d-f355-494a-adae-5a1dba9c360a" (UID: "4ccab57d-f355-494a-adae-5a1dba9c360a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.810897 4738 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.810931 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8ph6\" (UniqueName: \"kubernetes.io/projected/2731d6f8-adb2-4068-a3c9-162cfcb7de07-kube-api-access-m8ph6\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.810940 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ccab57d-f355-494a-adae-5a1dba9c360a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.810951 4738 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-6b76v\" (UniqueName: \"kubernetes.io/projected/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc-kube-api-access-6b76v\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.810959 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2w9b\" (UniqueName: \"kubernetes.io/projected/4ccab57d-f355-494a-adae-5a1dba9c360a-kube-api-access-x2w9b\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.844290 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2731d6f8-adb2-4068-a3c9-162cfcb7de07-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2731d6f8-adb2-4068-a3c9-162cfcb7de07" (UID: "2731d6f8-adb2-4068-a3c9-162cfcb7de07"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:06:50 crc kubenswrapper[4738]: I0307 07:06:50.912589 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2731d6f8-adb2-4068-a3c9-162cfcb7de07-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.132828 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rnt9k" event={"ID":"4ccab57d-f355-494a-adae-5a1dba9c360a","Type":"ContainerDied","Data":"f9479bfa25053e3452ceb83bed98544fe94e3e84ca6e014f3db218720e3df08b"} Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.133233 4738 scope.go:117] "RemoveContainer" containerID="876bf0bfd99e9f59e4ce96d97e0a1b03b380e51fb8e6acc67097ce1612d9d83c" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.132879 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rnt9k" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.134485 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" event={"ID":"16598a41-f8be-4b82-84c1-b718c0b24b8e","Type":"ContainerStarted","Data":"47bb3fb77e6f3022907974a6e2a5a03e3e251508b5fc2ea251ca1f8d0463a727"} Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.134539 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" event={"ID":"16598a41-f8be-4b82-84c1-b718c0b24b8e","Type":"ContainerStarted","Data":"07838b1a187f7bc82e8861d629efebe907f20e9904122535ad623f5536cd7cd6"} Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.134715 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.136578 4738 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qzsqg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.74:8080/healthz\": dial tcp 10.217.0.74:8080: connect: connection refused" start-of-body= Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.136624 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" podUID="16598a41-f8be-4b82-84c1-b718c0b24b8e" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.74:8080/healthz\": dial tcp 10.217.0.74:8080: connect: connection refused" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.137409 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-67thp" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.137460 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-67thp" event={"ID":"6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc","Type":"ContainerDied","Data":"f1c44c3a2e101aa6aa6f78141d0e0c78c5231986b63f64bcf4884107a76effcb"} Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.141585 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-t6bh9" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.141597 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-t6bh9" event={"ID":"f34dc9c2-f90f-473e-8dca-a3df3f70e02f","Type":"ContainerDied","Data":"b50475eefaa9e7509eefbd849ea95eabe6da405b3716000557b5c4dac3dd82f9"} Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.145103 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gfmjp" event={"ID":"2731d6f8-adb2-4068-a3c9-162cfcb7de07","Type":"ContainerDied","Data":"55a2a1e2912bd12ed176110db8b60c8fc33ec3080e158bd64855676c21b3663b"} Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.145248 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gfmjp" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.152618 4738 scope.go:117] "RemoveContainer" containerID="f85e7306c0fe9641cacc9ce61234d007671bac5fc94b72ed6c37ad81f0fd4ccb" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.155423 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-587ww" event={"ID":"b82c0da7-caec-462d-85e9-f3c45cc042b5","Type":"ContainerDied","Data":"d3f135f1d2830f76a1c904bfa0d22a9cc8ce59fd39a839728b3f5119bb533a11"} Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.155496 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-587ww" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.173069 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" podStartSLOduration=2.173044964 podStartE2EDuration="2.173044964s" podCreationTimestamp="2026-03-07 07:06:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:06:51.171646776 +0000 UTC m=+429.636634097" watchObservedRunningTime="2026-03-07 07:06:51.173044964 +0000 UTC m=+429.638032295" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.183661 4738 scope.go:117] "RemoveContainer" containerID="48702995f20f0e43a4d64658a812c36ee8d8be5ef48a81f37af7c66d66fbc1d9" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.190956 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-67thp"] Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.204139 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-67thp"] Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.205536 4738 scope.go:117] 
"RemoveContainer" containerID="0630bea3371a28de009e9ab4e1f64a0ceb86668c7a022aa74187c40010202ef7" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.209339 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-t6bh9"] Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.212874 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-t6bh9"] Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.217291 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnt9k"] Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.221609 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rnt9k"] Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.231729 4738 scope.go:117] "RemoveContainer" containerID="d5b52c7a1fae65f0a2915cfbf369956f042ec4e9468ba591e2b6e6d7fb92015e" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.231853 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gfmjp"] Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.237560 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gfmjp"] Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.252629 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-587ww"] Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.254410 4738 scope.go:117] "RemoveContainer" containerID="baef5f61b5c6a47da32c50681175f6adc2a830d69eda495f46c36fa665ff6691" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.256561 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-587ww"] Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.273563 4738 scope.go:117] "RemoveContainer" 
containerID="338ecded0752c2690325db46f9d6ecc7a6a65f1919097f4cfe6ca4ab6b577f18" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.292118 4738 scope.go:117] "RemoveContainer" containerID="783cd768a6c50c8f4785a3e7d7a800f9fda331155faab9f13072762afb0b11c6" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.317312 4738 scope.go:117] "RemoveContainer" containerID="fe4de9ef7bba22b9e72537403649ec12746f991179432508152f8f20e2f6fb5d" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.333738 4738 scope.go:117] "RemoveContainer" containerID="eb86e95a5c935a21690af18bdaeb2732f5ef1b762aa6c887cc26353e53e183d3" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.349590 4738 scope.go:117] "RemoveContainer" containerID="897ac86f59ba3751505c36a9676f6c3a0f804e530e6fe99db330195d77a63ea8" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.364052 4738 scope.go:117] "RemoveContainer" containerID="7ec70bde127e5efa595a79991ae933070bbc3143eb92f970d10106225c9843b0" Mar 07 07:06:51 crc kubenswrapper[4738]: I0307 07:06:51.383393 4738 scope.go:117] "RemoveContainer" containerID="971cdad0a3131bc196b1db49e70b891178c9c01db44d9bb41e01e7435211703e" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093434 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6glbr"] Mar 07 07:06:52 crc kubenswrapper[4738]: E0307 07:06:52.093625 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2731d6f8-adb2-4068-a3c9-162cfcb7de07" containerName="registry-server" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093637 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="2731d6f8-adb2-4068-a3c9-162cfcb7de07" containerName="registry-server" Mar 07 07:06:52 crc kubenswrapper[4738]: E0307 07:06:52.093647 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ccab57d-f355-494a-adae-5a1dba9c360a" containerName="extract-content" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093653 4738 
state_mem.go:107] "Deleted CPUSet assignment" podUID="4ccab57d-f355-494a-adae-5a1dba9c360a" containerName="extract-content" Mar 07 07:06:52 crc kubenswrapper[4738]: E0307 07:06:52.093664 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc" containerName="marketplace-operator" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093671 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc" containerName="marketplace-operator" Mar 07 07:06:52 crc kubenswrapper[4738]: E0307 07:06:52.093680 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34dc9c2-f90f-473e-8dca-a3df3f70e02f" containerName="extract-utilities" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093686 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34dc9c2-f90f-473e-8dca-a3df3f70e02f" containerName="extract-utilities" Mar 07 07:06:52 crc kubenswrapper[4738]: E0307 07:06:52.093693 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2731d6f8-adb2-4068-a3c9-162cfcb7de07" containerName="extract-content" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093699 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="2731d6f8-adb2-4068-a3c9-162cfcb7de07" containerName="extract-content" Mar 07 07:06:52 crc kubenswrapper[4738]: E0307 07:06:52.093708 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ccab57d-f355-494a-adae-5a1dba9c360a" containerName="extract-utilities" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093714 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ccab57d-f355-494a-adae-5a1dba9c360a" containerName="extract-utilities" Mar 07 07:06:52 crc kubenswrapper[4738]: E0307 07:06:52.093722 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34dc9c2-f90f-473e-8dca-a3df3f70e02f" containerName="extract-content" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093727 
4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34dc9c2-f90f-473e-8dca-a3df3f70e02f" containerName="extract-content" Mar 07 07:06:52 crc kubenswrapper[4738]: E0307 07:06:52.093738 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b82c0da7-caec-462d-85e9-f3c45cc042b5" containerName="extract-content" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093743 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82c0da7-caec-462d-85e9-f3c45cc042b5" containerName="extract-content" Mar 07 07:06:52 crc kubenswrapper[4738]: E0307 07:06:52.093753 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2731d6f8-adb2-4068-a3c9-162cfcb7de07" containerName="extract-utilities" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093759 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="2731d6f8-adb2-4068-a3c9-162cfcb7de07" containerName="extract-utilities" Mar 07 07:06:52 crc kubenswrapper[4738]: E0307 07:06:52.093765 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ccab57d-f355-494a-adae-5a1dba9c360a" containerName="registry-server" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093770 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ccab57d-f355-494a-adae-5a1dba9c360a" containerName="registry-server" Mar 07 07:06:52 crc kubenswrapper[4738]: E0307 07:06:52.093778 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b82c0da7-caec-462d-85e9-f3c45cc042b5" containerName="extract-utilities" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093784 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b82c0da7-caec-462d-85e9-f3c45cc042b5" containerName="extract-utilities" Mar 07 07:06:52 crc kubenswrapper[4738]: E0307 07:06:52.093791 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b82c0da7-caec-462d-85e9-f3c45cc042b5" containerName="registry-server" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093797 4738 
state_mem.go:107] "Deleted CPUSet assignment" podUID="b82c0da7-caec-462d-85e9-f3c45cc042b5" containerName="registry-server" Mar 07 07:06:52 crc kubenswrapper[4738]: E0307 07:06:52.093806 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f34dc9c2-f90f-473e-8dca-a3df3f70e02f" containerName="registry-server" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093812 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f34dc9c2-f90f-473e-8dca-a3df3f70e02f" containerName="registry-server" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093887 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ccab57d-f355-494a-adae-5a1dba9c360a" containerName="registry-server" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093897 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="b82c0da7-caec-462d-85e9-f3c45cc042b5" containerName="registry-server" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093908 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="2731d6f8-adb2-4068-a3c9-162cfcb7de07" containerName="registry-server" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093918 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="f34dc9c2-f90f-473e-8dca-a3df3f70e02f" containerName="registry-server" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.093927 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc" containerName="marketplace-operator" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.094571 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6glbr" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.097046 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.105548 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6glbr"] Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.165885 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qzsqg" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.232281 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8341b7-746a-448e-92e5-b7011a75e332-utilities\") pod \"redhat-marketplace-6glbr\" (UID: \"5c8341b7-746a-448e-92e5-b7011a75e332\") " pod="openshift-marketplace/redhat-marketplace-6glbr" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.233470 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mmwz\" (UniqueName: \"kubernetes.io/projected/5c8341b7-746a-448e-92e5-b7011a75e332-kube-api-access-9mmwz\") pod \"redhat-marketplace-6glbr\" (UID: \"5c8341b7-746a-448e-92e5-b7011a75e332\") " pod="openshift-marketplace/redhat-marketplace-6glbr" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.233607 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8341b7-746a-448e-92e5-b7011a75e332-catalog-content\") pod \"redhat-marketplace-6glbr\" (UID: \"5c8341b7-746a-448e-92e5-b7011a75e332\") " pod="openshift-marketplace/redhat-marketplace-6glbr" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.289326 4738 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-k7wrp"] Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.290811 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.296036 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.297515 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k7wrp"] Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.334433 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mmwz\" (UniqueName: \"kubernetes.io/projected/5c8341b7-746a-448e-92e5-b7011a75e332-kube-api-access-9mmwz\") pod \"redhat-marketplace-6glbr\" (UID: \"5c8341b7-746a-448e-92e5-b7011a75e332\") " pod="openshift-marketplace/redhat-marketplace-6glbr" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.334867 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5c8341b7-746a-448e-92e5-b7011a75e332-catalog-content\") pod \"redhat-marketplace-6glbr\" (UID: \"5c8341b7-746a-448e-92e5-b7011a75e332\") " pod="openshift-marketplace/redhat-marketplace-6glbr" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.334902 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8341b7-746a-448e-92e5-b7011a75e332-utilities\") pod \"redhat-marketplace-6glbr\" (UID: \"5c8341b7-746a-448e-92e5-b7011a75e332\") " pod="openshift-marketplace/redhat-marketplace-6glbr" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.335448 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5c8341b7-746a-448e-92e5-b7011a75e332-catalog-content\") pod \"redhat-marketplace-6glbr\" (UID: \"5c8341b7-746a-448e-92e5-b7011a75e332\") " pod="openshift-marketplace/redhat-marketplace-6glbr" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.335486 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5c8341b7-746a-448e-92e5-b7011a75e332-utilities\") pod \"redhat-marketplace-6glbr\" (UID: \"5c8341b7-746a-448e-92e5-b7011a75e332\") " pod="openshift-marketplace/redhat-marketplace-6glbr" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.355010 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mmwz\" (UniqueName: \"kubernetes.io/projected/5c8341b7-746a-448e-92e5-b7011a75e332-kube-api-access-9mmwz\") pod \"redhat-marketplace-6glbr\" (UID: \"5c8341b7-746a-448e-92e5-b7011a75e332\") " pod="openshift-marketplace/redhat-marketplace-6glbr" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.391692 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2731d6f8-adb2-4068-a3c9-162cfcb7de07" path="/var/lib/kubelet/pods/2731d6f8-adb2-4068-a3c9-162cfcb7de07/volumes" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.392468 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ccab57d-f355-494a-adae-5a1dba9c360a" path="/var/lib/kubelet/pods/4ccab57d-f355-494a-adae-5a1dba9c360a/volumes" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.393044 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc" path="/var/lib/kubelet/pods/6ecf5a42-ecf5-4d68-8dd8-aed5322c30fc/volumes" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.393497 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b82c0da7-caec-462d-85e9-f3c45cc042b5" path="/var/lib/kubelet/pods/b82c0da7-caec-462d-85e9-f3c45cc042b5/volumes" Mar 07 
07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.394057 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f34dc9c2-f90f-473e-8dca-a3df3f70e02f" path="/var/lib/kubelet/pods/f34dc9c2-f90f-473e-8dca-a3df3f70e02f/volumes" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.421077 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6glbr" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.436500 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvb95\" (UniqueName: \"kubernetes.io/projected/8e72c18f-8de1-4466-9614-d2b6b27749a3-kube-api-access-cvb95\") pod \"community-operators-k7wrp\" (UID: \"8e72c18f-8de1-4466-9614-d2b6b27749a3\") " pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.436619 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e72c18f-8de1-4466-9614-d2b6b27749a3-utilities\") pod \"community-operators-k7wrp\" (UID: \"8e72c18f-8de1-4466-9614-d2b6b27749a3\") " pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.436655 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e72c18f-8de1-4466-9614-d2b6b27749a3-catalog-content\") pod \"community-operators-k7wrp\" (UID: \"8e72c18f-8de1-4466-9614-d2b6b27749a3\") " pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.537600 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e72c18f-8de1-4466-9614-d2b6b27749a3-utilities\") pod \"community-operators-k7wrp\" (UID: 
\"8e72c18f-8de1-4466-9614-d2b6b27749a3\") " pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.537654 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e72c18f-8de1-4466-9614-d2b6b27749a3-catalog-content\") pod \"community-operators-k7wrp\" (UID: \"8e72c18f-8de1-4466-9614-d2b6b27749a3\") " pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.537717 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvb95\" (UniqueName: \"kubernetes.io/projected/8e72c18f-8de1-4466-9614-d2b6b27749a3-kube-api-access-cvb95\") pod \"community-operators-k7wrp\" (UID: \"8e72c18f-8de1-4466-9614-d2b6b27749a3\") " pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.538303 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e72c18f-8de1-4466-9614-d2b6b27749a3-utilities\") pod \"community-operators-k7wrp\" (UID: \"8e72c18f-8de1-4466-9614-d2b6b27749a3\") " pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.538367 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e72c18f-8de1-4466-9614-d2b6b27749a3-catalog-content\") pod \"community-operators-k7wrp\" (UID: \"8e72c18f-8de1-4466-9614-d2b6b27749a3\") " pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.558567 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvb95\" (UniqueName: \"kubernetes.io/projected/8e72c18f-8de1-4466-9614-d2b6b27749a3-kube-api-access-cvb95\") pod \"community-operators-k7wrp\" (UID: 
\"8e72c18f-8de1-4466-9614-d2b6b27749a3\") " pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.609708 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:06:52 crc kubenswrapper[4738]: I0307 07:06:52.852068 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6glbr"] Mar 07 07:06:52 crc kubenswrapper[4738]: W0307 07:06:52.854448 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c8341b7_746a_448e_92e5_b7011a75e332.slice/crio-55c07ace589c52609dad11a9d7bf3650fbf18141244398e456619f502b4baf1d WatchSource:0}: Error finding container 55c07ace589c52609dad11a9d7bf3650fbf18141244398e456619f502b4baf1d: Status 404 returned error can't find the container with id 55c07ace589c52609dad11a9d7bf3650fbf18141244398e456619f502b4baf1d Mar 07 07:06:53 crc kubenswrapper[4738]: I0307 07:06:53.023072 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k7wrp"] Mar 07 07:06:53 crc kubenswrapper[4738]: W0307 07:06:53.084698 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e72c18f_8de1_4466_9614_d2b6b27749a3.slice/crio-27628b148e6100720b5cb58ef20fdc979b2d793fe6ab2a0401b11a689be438eb WatchSource:0}: Error finding container 27628b148e6100720b5cb58ef20fdc979b2d793fe6ab2a0401b11a689be438eb: Status 404 returned error can't find the container with id 27628b148e6100720b5cb58ef20fdc979b2d793fe6ab2a0401b11a689be438eb Mar 07 07:06:53 crc kubenswrapper[4738]: I0307 07:06:53.170135 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7wrp" 
event={"ID":"8e72c18f-8de1-4466-9614-d2b6b27749a3","Type":"ContainerStarted","Data":"27628b148e6100720b5cb58ef20fdc979b2d793fe6ab2a0401b11a689be438eb"} Mar 07 07:06:53 crc kubenswrapper[4738]: I0307 07:06:53.173420 4738 generic.go:334] "Generic (PLEG): container finished" podID="5c8341b7-746a-448e-92e5-b7011a75e332" containerID="ed60e6367f3cdb0c6f1435fd55bd1514c00c5c2a85fd991336ec05b7bb61c53d" exitCode=0 Mar 07 07:06:53 crc kubenswrapper[4738]: I0307 07:06:53.173545 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6glbr" event={"ID":"5c8341b7-746a-448e-92e5-b7011a75e332","Type":"ContainerDied","Data":"ed60e6367f3cdb0c6f1435fd55bd1514c00c5c2a85fd991336ec05b7bb61c53d"} Mar 07 07:06:53 crc kubenswrapper[4738]: I0307 07:06:53.173608 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6glbr" event={"ID":"5c8341b7-746a-448e-92e5-b7011a75e332","Type":"ContainerStarted","Data":"55c07ace589c52609dad11a9d7bf3650fbf18141244398e456619f502b4baf1d"} Mar 07 07:06:53 crc kubenswrapper[4738]: I0307 07:06:53.984638 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr"] Mar 07 07:06:53 crc kubenswrapper[4738]: I0307 07:06:53.985263 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" podUID="8bcdcae6-ffd9-461b-a0e1-272c93108787" containerName="controller-manager" containerID="cri-o://8578c250b4bb96ac0338c14d75d881031f5bc1c47f28f0fcc11bc1b583a820c6" gracePeriod=30 Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.180684 4738 generic.go:334] "Generic (PLEG): container finished" podID="8bcdcae6-ffd9-461b-a0e1-272c93108787" containerID="8578c250b4bb96ac0338c14d75d881031f5bc1c47f28f0fcc11bc1b583a820c6" exitCode=0 Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.180781 4738 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" event={"ID":"8bcdcae6-ffd9-461b-a0e1-272c93108787","Type":"ContainerDied","Data":"8578c250b4bb96ac0338c14d75d881031f5bc1c47f28f0fcc11bc1b583a820c6"} Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.184048 4738 generic.go:334] "Generic (PLEG): container finished" podID="5c8341b7-746a-448e-92e5-b7011a75e332" containerID="fc4c1ad11c573b0a69fa014a54beb48e8f5ef3c2b608893330c9ac839b9de30a" exitCode=0 Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.184140 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6glbr" event={"ID":"5c8341b7-746a-448e-92e5-b7011a75e332","Type":"ContainerDied","Data":"fc4c1ad11c573b0a69fa014a54beb48e8f5ef3c2b608893330c9ac839b9de30a"} Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.187868 4738 generic.go:334] "Generic (PLEG): container finished" podID="8e72c18f-8de1-4466-9614-d2b6b27749a3" containerID="94920fa204cc71ff387779a830977b692db29e09ebcb4ea611908a22b4e968e5" exitCode=0 Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.187917 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7wrp" event={"ID":"8e72c18f-8de1-4466-9614-d2b6b27749a3","Type":"ContainerDied","Data":"94920fa204cc71ff387779a830977b692db29e09ebcb4ea611908a22b4e968e5"} Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.401044 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.464875 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g98jc\" (UniqueName: \"kubernetes.io/projected/8bcdcae6-ffd9-461b-a0e1-272c93108787-kube-api-access-g98jc\") pod \"8bcdcae6-ffd9-461b-a0e1-272c93108787\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.464973 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-client-ca\") pod \"8bcdcae6-ffd9-461b-a0e1-272c93108787\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.465044 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bcdcae6-ffd9-461b-a0e1-272c93108787-serving-cert\") pod \"8bcdcae6-ffd9-461b-a0e1-272c93108787\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.465090 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-proxy-ca-bundles\") pod \"8bcdcae6-ffd9-461b-a0e1-272c93108787\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.465274 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-config\") pod \"8bcdcae6-ffd9-461b-a0e1-272c93108787\" (UID: \"8bcdcae6-ffd9-461b-a0e1-272c93108787\") " Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.465676 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-client-ca" (OuterVolumeSpecName: "client-ca") pod "8bcdcae6-ffd9-461b-a0e1-272c93108787" (UID: "8bcdcae6-ffd9-461b-a0e1-272c93108787"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.465991 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-config" (OuterVolumeSpecName: "config") pod "8bcdcae6-ffd9-461b-a0e1-272c93108787" (UID: "8bcdcae6-ffd9-461b-a0e1-272c93108787"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.469166 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8bcdcae6-ffd9-461b-a0e1-272c93108787" (UID: "8bcdcae6-ffd9-461b-a0e1-272c93108787"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.474062 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8bcdcae6-ffd9-461b-a0e1-272c93108787-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8bcdcae6-ffd9-461b-a0e1-272c93108787" (UID: "8bcdcae6-ffd9-461b-a0e1-272c93108787"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.474244 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bcdcae6-ffd9-461b-a0e1-272c93108787-kube-api-access-g98jc" (OuterVolumeSpecName: "kube-api-access-g98jc") pod "8bcdcae6-ffd9-461b-a0e1-272c93108787" (UID: "8bcdcae6-ffd9-461b-a0e1-272c93108787"). InnerVolumeSpecName "kube-api-access-g98jc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.489084 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9t248"] Mar 07 07:06:54 crc kubenswrapper[4738]: E0307 07:06:54.489365 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bcdcae6-ffd9-461b-a0e1-272c93108787" containerName="controller-manager" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.489382 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bcdcae6-ffd9-461b-a0e1-272c93108787" containerName="controller-manager" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.489495 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bcdcae6-ffd9-461b-a0e1-272c93108787" containerName="controller-manager" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.490335 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9t248" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.492285 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.506037 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9t248"] Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.567222 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlwws\" (UniqueName: \"kubernetes.io/projected/3aa7e15f-67b8-491e-b579-a52873583f7f-kube-api-access-dlwws\") pod \"redhat-operators-9t248\" (UID: \"3aa7e15f-67b8-491e-b579-a52873583f7f\") " pod="openshift-marketplace/redhat-operators-9t248" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.567323 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3aa7e15f-67b8-491e-b579-a52873583f7f-utilities\") pod \"redhat-operators-9t248\" (UID: \"3aa7e15f-67b8-491e-b579-a52873583f7f\") " pod="openshift-marketplace/redhat-operators-9t248" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.567374 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aa7e15f-67b8-491e-b579-a52873583f7f-catalog-content\") pod \"redhat-operators-9t248\" (UID: \"3aa7e15f-67b8-491e-b579-a52873583f7f\") " pod="openshift-marketplace/redhat-operators-9t248" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.567516 4738 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.567548 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g98jc\" (UniqueName: \"kubernetes.io/projected/8bcdcae6-ffd9-461b-a0e1-272c93108787-kube-api-access-g98jc\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.567562 4738 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-client-ca\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.567594 4738 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8bcdcae6-ffd9-461b-a0e1-272c93108787-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.567604 4738 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8bcdcae6-ffd9-461b-a0e1-272c93108787-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 
07:06:54.668614 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aa7e15f-67b8-491e-b579-a52873583f7f-utilities\") pod \"redhat-operators-9t248\" (UID: \"3aa7e15f-67b8-491e-b579-a52873583f7f\") " pod="openshift-marketplace/redhat-operators-9t248" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.668678 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aa7e15f-67b8-491e-b579-a52873583f7f-catalog-content\") pod \"redhat-operators-9t248\" (UID: \"3aa7e15f-67b8-491e-b579-a52873583f7f\") " pod="openshift-marketplace/redhat-operators-9t248" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.668714 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlwws\" (UniqueName: \"kubernetes.io/projected/3aa7e15f-67b8-491e-b579-a52873583f7f-kube-api-access-dlwws\") pod \"redhat-operators-9t248\" (UID: \"3aa7e15f-67b8-491e-b579-a52873583f7f\") " pod="openshift-marketplace/redhat-operators-9t248" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.669219 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3aa7e15f-67b8-491e-b579-a52873583f7f-utilities\") pod \"redhat-operators-9t248\" (UID: \"3aa7e15f-67b8-491e-b579-a52873583f7f\") " pod="openshift-marketplace/redhat-operators-9t248" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.669603 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3aa7e15f-67b8-491e-b579-a52873583f7f-catalog-content\") pod \"redhat-operators-9t248\" (UID: \"3aa7e15f-67b8-491e-b579-a52873583f7f\") " pod="openshift-marketplace/redhat-operators-9t248" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.696667 4738 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-dlwws\" (UniqueName: \"kubernetes.io/projected/3aa7e15f-67b8-491e-b579-a52873583f7f-kube-api-access-dlwws\") pod \"redhat-operators-9t248\" (UID: \"3aa7e15f-67b8-491e-b579-a52873583f7f\") " pod="openshift-marketplace/redhat-operators-9t248" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.716646 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qchnm"] Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.755637 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qchnm" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.762164 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.773126 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qchnm"] Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.863943 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9t248" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.872938 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a530d643-c7f4-4f6c-8ff7-b256b67d6764-utilities\") pod \"certified-operators-qchnm\" (UID: \"a530d643-c7f4-4f6c-8ff7-b256b67d6764\") " pod="openshift-marketplace/certified-operators-qchnm" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.872990 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8b2k\" (UniqueName: \"kubernetes.io/projected/a530d643-c7f4-4f6c-8ff7-b256b67d6764-kube-api-access-g8b2k\") pod \"certified-operators-qchnm\" (UID: \"a530d643-c7f4-4f6c-8ff7-b256b67d6764\") " pod="openshift-marketplace/certified-operators-qchnm" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.873030 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a530d643-c7f4-4f6c-8ff7-b256b67d6764-catalog-content\") pod \"certified-operators-qchnm\" (UID: \"a530d643-c7f4-4f6c-8ff7-b256b67d6764\") " pod="openshift-marketplace/certified-operators-qchnm" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.975056 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a530d643-c7f4-4f6c-8ff7-b256b67d6764-utilities\") pod \"certified-operators-qchnm\" (UID: \"a530d643-c7f4-4f6c-8ff7-b256b67d6764\") " pod="openshift-marketplace/certified-operators-qchnm" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.975406 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8b2k\" (UniqueName: \"kubernetes.io/projected/a530d643-c7f4-4f6c-8ff7-b256b67d6764-kube-api-access-g8b2k\") pod 
\"certified-operators-qchnm\" (UID: \"a530d643-c7f4-4f6c-8ff7-b256b67d6764\") " pod="openshift-marketplace/certified-operators-qchnm" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.975448 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a530d643-c7f4-4f6c-8ff7-b256b67d6764-catalog-content\") pod \"certified-operators-qchnm\" (UID: \"a530d643-c7f4-4f6c-8ff7-b256b67d6764\") " pod="openshift-marketplace/certified-operators-qchnm" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.976443 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a530d643-c7f4-4f6c-8ff7-b256b67d6764-catalog-content\") pod \"certified-operators-qchnm\" (UID: \"a530d643-c7f4-4f6c-8ff7-b256b67d6764\") " pod="openshift-marketplace/certified-operators-qchnm" Mar 07 07:06:54 crc kubenswrapper[4738]: I0307 07:06:54.976713 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a530d643-c7f4-4f6c-8ff7-b256b67d6764-utilities\") pod \"certified-operators-qchnm\" (UID: \"a530d643-c7f4-4f6c-8ff7-b256b67d6764\") " pod="openshift-marketplace/certified-operators-qchnm" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.005465 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8b2k\" (UniqueName: \"kubernetes.io/projected/a530d643-c7f4-4f6c-8ff7-b256b67d6764-kube-api-access-g8b2k\") pod \"certified-operators-qchnm\" (UID: \"a530d643-c7f4-4f6c-8ff7-b256b67d6764\") " pod="openshift-marketplace/certified-operators-qchnm" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.136713 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qchnm" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.195779 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" event={"ID":"8bcdcae6-ffd9-461b-a0e1-272c93108787","Type":"ContainerDied","Data":"1f9e3fac3777ec40082dec6bb0d219326b05d2e8b84e7e7357400de85c4c4011"} Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.195854 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.195864 4738 scope.go:117] "RemoveContainer" containerID="8578c250b4bb96ac0338c14d75d881031f5bc1c47f28f0fcc11bc1b583a820c6" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.201881 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6glbr" event={"ID":"5c8341b7-746a-448e-92e5-b7011a75e332","Type":"ContainerStarted","Data":"5e1244ec4a2065f7c65ad7c2366955807355f8ec51e1a18af3bb576996c3b606"} Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.204686 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7wrp" event={"ID":"8e72c18f-8de1-4466-9614-d2b6b27749a3","Type":"ContainerStarted","Data":"8c81a0bf7d55f22e69f6d936087a0c67b8a26f25747ba93c00aea2274a07c20c"} Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.220088 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6glbr" podStartSLOduration=1.743848737 podStartE2EDuration="3.220067541s" podCreationTimestamp="2026-03-07 07:06:52 +0000 UTC" firstStartedPulling="2026-03-07 07:06:53.175572237 +0000 UTC m=+431.640559568" lastFinishedPulling="2026-03-07 07:06:54.651791051 +0000 UTC m=+433.116778372" observedRunningTime="2026-03-07 07:06:55.219346321 +0000 UTC 
m=+433.684333652" watchObservedRunningTime="2026-03-07 07:06:55.220067541 +0000 UTC m=+433.685054862" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.258894 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr"] Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.261864 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6dbd4c5b74-72gwr"] Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.290060 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9t248"] Mar 07 07:06:55 crc kubenswrapper[4738]: W0307 07:06:55.353668 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aa7e15f_67b8_491e_b579_a52873583f7f.slice/crio-865e58febe0948fdbd03cb99fb81af5ee688f3ba885939a99f7fa766866137c8 WatchSource:0}: Error finding container 865e58febe0948fdbd03cb99fb81af5ee688f3ba885939a99f7fa766866137c8: Status 404 returned error can't find the container with id 865e58febe0948fdbd03cb99fb81af5ee688f3ba885939a99f7fa766866137c8 Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.567825 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qchnm"] Mar 07 07:06:55 crc kubenswrapper[4738]: W0307 07:06:55.573947 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda530d643_c7f4_4f6c_8ff7_b256b67d6764.slice/crio-cbbaebb0ff4a3b594c2746d01ef60a40ecf5986092085bbd521460a970e675d0 WatchSource:0}: Error finding container cbbaebb0ff4a3b594c2746d01ef60a40ecf5986092085bbd521460a970e675d0: Status 404 returned error can't find the container with id cbbaebb0ff4a3b594c2746d01ef60a40ecf5986092085bbd521460a970e675d0 Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.738748 4738 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-controller-manager/controller-manager-79875d7946-j57l2"] Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.739400 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.741172 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.741347 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.742524 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.742925 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.743028 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.745235 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.750728 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79875d7946-j57l2"] Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.750921 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.786537 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b67f8b72-a2a7-4f90-a6b1-b6ba52449463-serving-cert\") pod \"controller-manager-79875d7946-j57l2\" (UID: \"b67f8b72-a2a7-4f90-a6b1-b6ba52449463\") " pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.786591 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8kqk\" (UniqueName: \"kubernetes.io/projected/b67f8b72-a2a7-4f90-a6b1-b6ba52449463-kube-api-access-b8kqk\") pod \"controller-manager-79875d7946-j57l2\" (UID: \"b67f8b72-a2a7-4f90-a6b1-b6ba52449463\") " pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.786615 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b67f8b72-a2a7-4f90-a6b1-b6ba52449463-config\") pod \"controller-manager-79875d7946-j57l2\" (UID: \"b67f8b72-a2a7-4f90-a6b1-b6ba52449463\") " pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.786650 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b67f8b72-a2a7-4f90-a6b1-b6ba52449463-proxy-ca-bundles\") pod \"controller-manager-79875d7946-j57l2\" (UID: \"b67f8b72-a2a7-4f90-a6b1-b6ba52449463\") " pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.786667 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b67f8b72-a2a7-4f90-a6b1-b6ba52449463-client-ca\") pod \"controller-manager-79875d7946-j57l2\" (UID: \"b67f8b72-a2a7-4f90-a6b1-b6ba52449463\") " pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" 
Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.887601 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b67f8b72-a2a7-4f90-a6b1-b6ba52449463-proxy-ca-bundles\") pod \"controller-manager-79875d7946-j57l2\" (UID: \"b67f8b72-a2a7-4f90-a6b1-b6ba52449463\") " pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.887880 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b67f8b72-a2a7-4f90-a6b1-b6ba52449463-client-ca\") pod \"controller-manager-79875d7946-j57l2\" (UID: \"b67f8b72-a2a7-4f90-a6b1-b6ba52449463\") " pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.887981 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b67f8b72-a2a7-4f90-a6b1-b6ba52449463-serving-cert\") pod \"controller-manager-79875d7946-j57l2\" (UID: \"b67f8b72-a2a7-4f90-a6b1-b6ba52449463\") " pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.888010 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8kqk\" (UniqueName: \"kubernetes.io/projected/b67f8b72-a2a7-4f90-a6b1-b6ba52449463-kube-api-access-b8kqk\") pod \"controller-manager-79875d7946-j57l2\" (UID: \"b67f8b72-a2a7-4f90-a6b1-b6ba52449463\") " pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.888037 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b67f8b72-a2a7-4f90-a6b1-b6ba52449463-config\") pod \"controller-manager-79875d7946-j57l2\" (UID: 
\"b67f8b72-a2a7-4f90-a6b1-b6ba52449463\") " pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.888710 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b67f8b72-a2a7-4f90-a6b1-b6ba52449463-proxy-ca-bundles\") pod \"controller-manager-79875d7946-j57l2\" (UID: \"b67f8b72-a2a7-4f90-a6b1-b6ba52449463\") " pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.889534 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b67f8b72-a2a7-4f90-a6b1-b6ba52449463-config\") pod \"controller-manager-79875d7946-j57l2\" (UID: \"b67f8b72-a2a7-4f90-a6b1-b6ba52449463\") " pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.889539 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b67f8b72-a2a7-4f90-a6b1-b6ba52449463-client-ca\") pod \"controller-manager-79875d7946-j57l2\" (UID: \"b67f8b72-a2a7-4f90-a6b1-b6ba52449463\") " pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.896265 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b67f8b72-a2a7-4f90-a6b1-b6ba52449463-serving-cert\") pod \"controller-manager-79875d7946-j57l2\" (UID: \"b67f8b72-a2a7-4f90-a6b1-b6ba52449463\") " pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:55 crc kubenswrapper[4738]: I0307 07:06:55.912219 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8kqk\" (UniqueName: 
\"kubernetes.io/projected/b67f8b72-a2a7-4f90-a6b1-b6ba52449463-kube-api-access-b8kqk\") pod \"controller-manager-79875d7946-j57l2\" (UID: \"b67f8b72-a2a7-4f90-a6b1-b6ba52449463\") " pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:56 crc kubenswrapper[4738]: I0307 07:06:56.114881 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:56 crc kubenswrapper[4738]: I0307 07:06:56.237903 4738 generic.go:334] "Generic (PLEG): container finished" podID="8e72c18f-8de1-4466-9614-d2b6b27749a3" containerID="8c81a0bf7d55f22e69f6d936087a0c67b8a26f25747ba93c00aea2274a07c20c" exitCode=0 Mar 07 07:06:56 crc kubenswrapper[4738]: I0307 07:06:56.237984 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7wrp" event={"ID":"8e72c18f-8de1-4466-9614-d2b6b27749a3","Type":"ContainerDied","Data":"8c81a0bf7d55f22e69f6d936087a0c67b8a26f25747ba93c00aea2274a07c20c"} Mar 07 07:06:56 crc kubenswrapper[4738]: I0307 07:06:56.241749 4738 generic.go:334] "Generic (PLEG): container finished" podID="a530d643-c7f4-4f6c-8ff7-b256b67d6764" containerID="e03a6b6b5323552319847ff09edc4eeb7b16d9a5a7895b751b08e8225b3aca29" exitCode=0 Mar 07 07:06:56 crc kubenswrapper[4738]: I0307 07:06:56.241981 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qchnm" event={"ID":"a530d643-c7f4-4f6c-8ff7-b256b67d6764","Type":"ContainerDied","Data":"e03a6b6b5323552319847ff09edc4eeb7b16d9a5a7895b751b08e8225b3aca29"} Mar 07 07:06:56 crc kubenswrapper[4738]: I0307 07:06:56.242151 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qchnm" event={"ID":"a530d643-c7f4-4f6c-8ff7-b256b67d6764","Type":"ContainerStarted","Data":"cbbaebb0ff4a3b594c2746d01ef60a40ecf5986092085bbd521460a970e675d0"} Mar 07 07:06:56 crc kubenswrapper[4738]: I0307 
07:06:56.244839 4738 generic.go:334] "Generic (PLEG): container finished" podID="3aa7e15f-67b8-491e-b579-a52873583f7f" containerID="cae23bac19e57b2957cce6543663b514d81eb942d21d9d874801a7d578e369f2" exitCode=0 Mar 07 07:06:56 crc kubenswrapper[4738]: I0307 07:06:56.244969 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9t248" event={"ID":"3aa7e15f-67b8-491e-b579-a52873583f7f","Type":"ContainerDied","Data":"cae23bac19e57b2957cce6543663b514d81eb942d21d9d874801a7d578e369f2"} Mar 07 07:06:56 crc kubenswrapper[4738]: I0307 07:06:56.245005 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9t248" event={"ID":"3aa7e15f-67b8-491e-b579-a52873583f7f","Type":"ContainerStarted","Data":"865e58febe0948fdbd03cb99fb81af5ee688f3ba885939a99f7fa766866137c8"} Mar 07 07:06:58 crc kubenswrapper[4738]: I0307 07:06:56.396051 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bcdcae6-ffd9-461b-a0e1-272c93108787" path="/var/lib/kubelet/pods/8bcdcae6-ffd9-461b-a0e1-272c93108787/volumes" Mar 07 07:06:58 crc kubenswrapper[4738]: I0307 07:06:56.564698 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79875d7946-j57l2"] Mar 07 07:06:58 crc kubenswrapper[4738]: W0307 07:06:56.574575 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb67f8b72_a2a7_4f90_a6b1_b6ba52449463.slice/crio-9524be28868d6617d12f4d6adadeb91a06e725ab25e98cc376bce1b1abf53cf7 WatchSource:0}: Error finding container 9524be28868d6617d12f4d6adadeb91a06e725ab25e98cc376bce1b1abf53cf7: Status 404 returned error can't find the container with id 9524be28868d6617d12f4d6adadeb91a06e725ab25e98cc376bce1b1abf53cf7 Mar 07 07:06:58 crc kubenswrapper[4738]: I0307 07:06:57.256044 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" event={"ID":"b67f8b72-a2a7-4f90-a6b1-b6ba52449463","Type":"ContainerStarted","Data":"12a0cee28fd757e142c59047b292d4464fb7796e7280259bb11eb6224b5c2721"} Mar 07 07:06:58 crc kubenswrapper[4738]: I0307 07:06:57.256347 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" event={"ID":"b67f8b72-a2a7-4f90-a6b1-b6ba52449463","Type":"ContainerStarted","Data":"9524be28868d6617d12f4d6adadeb91a06e725ab25e98cc376bce1b1abf53cf7"} Mar 07 07:06:58 crc kubenswrapper[4738]: I0307 07:06:57.256373 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:58 crc kubenswrapper[4738]: I0307 07:06:57.261238 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" Mar 07 07:06:58 crc kubenswrapper[4738]: I0307 07:06:57.277103 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-79875d7946-j57l2" podStartSLOduration=4.277085791 podStartE2EDuration="4.277085791s" podCreationTimestamp="2026-03-07 07:06:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:06:57.274056869 +0000 UTC m=+435.739044190" watchObservedRunningTime="2026-03-07 07:06:57.277085791 +0000 UTC m=+435.742073112" Mar 07 07:06:58 crc kubenswrapper[4738]: I0307 07:06:58.262479 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7wrp" event={"ID":"8e72c18f-8de1-4466-9614-d2b6b27749a3","Type":"ContainerStarted","Data":"69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c"} Mar 07 07:06:58 crc kubenswrapper[4738]: I0307 07:06:58.264957 4738 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-qchnm" event={"ID":"a530d643-c7f4-4f6c-8ff7-b256b67d6764","Type":"ContainerStarted","Data":"2fcb176e75ba03739713e4cf126dca59a8580cf7ccbed9d1efd7fdf8922aa180"} Mar 07 07:06:58 crc kubenswrapper[4738]: I0307 07:06:58.267430 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9t248" event={"ID":"3aa7e15f-67b8-491e-b579-a52873583f7f","Type":"ContainerStarted","Data":"2841b9d1ad0c54a008416a9680331211ee86d43ff27471062feb6cbc32a0aebb"} Mar 07 07:06:58 crc kubenswrapper[4738]: I0307 07:06:58.290206 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k7wrp" podStartSLOduration=2.5923522009999997 podStartE2EDuration="6.290192227s" podCreationTimestamp="2026-03-07 07:06:52 +0000 UTC" firstStartedPulling="2026-03-07 07:06:54.190388839 +0000 UTC m=+432.655376160" lastFinishedPulling="2026-03-07 07:06:57.888228865 +0000 UTC m=+436.353216186" observedRunningTime="2026-03-07 07:06:58.289135809 +0000 UTC m=+436.754123130" watchObservedRunningTime="2026-03-07 07:06:58.290192227 +0000 UTC m=+436.755179548" Mar 07 07:06:59 crc kubenswrapper[4738]: I0307 07:06:59.274625 4738 generic.go:334] "Generic (PLEG): container finished" podID="a530d643-c7f4-4f6c-8ff7-b256b67d6764" containerID="2fcb176e75ba03739713e4cf126dca59a8580cf7ccbed9d1efd7fdf8922aa180" exitCode=0 Mar 07 07:06:59 crc kubenswrapper[4738]: I0307 07:06:59.274723 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qchnm" event={"ID":"a530d643-c7f4-4f6c-8ff7-b256b67d6764","Type":"ContainerDied","Data":"2fcb176e75ba03739713e4cf126dca59a8580cf7ccbed9d1efd7fdf8922aa180"} Mar 07 07:06:59 crc kubenswrapper[4738]: I0307 07:06:59.281979 4738 generic.go:334] "Generic (PLEG): container finished" podID="3aa7e15f-67b8-491e-b579-a52873583f7f" 
containerID="2841b9d1ad0c54a008416a9680331211ee86d43ff27471062feb6cbc32a0aebb" exitCode=0 Mar 07 07:06:59 crc kubenswrapper[4738]: I0307 07:06:59.282248 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9t248" event={"ID":"3aa7e15f-67b8-491e-b579-a52873583f7f","Type":"ContainerDied","Data":"2841b9d1ad0c54a008416a9680331211ee86d43ff27471062feb6cbc32a0aebb"} Mar 07 07:07:00 crc kubenswrapper[4738]: I0307 07:07:00.291274 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qchnm" event={"ID":"a530d643-c7f4-4f6c-8ff7-b256b67d6764","Type":"ContainerStarted","Data":"dff53950894b566c31326d8922a38db08c065159348925a1c565ca84545f4e8f"} Mar 07 07:07:00 crc kubenswrapper[4738]: I0307 07:07:00.294880 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9t248" event={"ID":"3aa7e15f-67b8-491e-b579-a52873583f7f","Type":"ContainerStarted","Data":"177a7fdbb0e34be4822a4b19eefec371c3a54b554d49e5ba5e910170d70dc030"} Mar 07 07:07:00 crc kubenswrapper[4738]: I0307 07:07:00.323050 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qchnm" podStartSLOduration=2.685894935 podStartE2EDuration="6.323026127s" podCreationTimestamp="2026-03-07 07:06:54 +0000 UTC" firstStartedPulling="2026-03-07 07:06:56.244190013 +0000 UTC m=+434.709177344" lastFinishedPulling="2026-03-07 07:06:59.881321195 +0000 UTC m=+438.346308536" observedRunningTime="2026-03-07 07:07:00.320343485 +0000 UTC m=+438.785330806" watchObservedRunningTime="2026-03-07 07:07:00.323026127 +0000 UTC m=+438.788013448" Mar 07 07:07:00 crc kubenswrapper[4738]: I0307 07:07:00.349355 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9t248" podStartSLOduration=2.9111171369999997 podStartE2EDuration="6.349338745s" podCreationTimestamp="2026-03-07 07:06:54 +0000 UTC" 
firstStartedPulling="2026-03-07 07:06:56.24634294 +0000 UTC m=+434.711330261" lastFinishedPulling="2026-03-07 07:06:59.684564548 +0000 UTC m=+438.149551869" observedRunningTime="2026-03-07 07:07:00.345926043 +0000 UTC m=+438.810913364" watchObservedRunningTime="2026-03-07 07:07:00.349338745 +0000 UTC m=+438.814326066" Mar 07 07:07:02 crc kubenswrapper[4738]: I0307 07:07:02.421683 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6glbr" Mar 07 07:07:02 crc kubenswrapper[4738]: I0307 07:07:02.423369 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6glbr" Mar 07 07:07:02 crc kubenswrapper[4738]: I0307 07:07:02.469569 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6glbr" Mar 07 07:07:02 crc kubenswrapper[4738]: I0307 07:07:02.610964 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:07:02 crc kubenswrapper[4738]: I0307 07:07:02.611189 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:07:02 crc kubenswrapper[4738]: I0307 07:07:02.649414 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:07:03 crc kubenswrapper[4738]: I0307 07:07:03.355781 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:07:03 crc kubenswrapper[4738]: I0307 07:07:03.357801 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6glbr" Mar 07 07:07:04 crc kubenswrapper[4738]: I0307 07:07:04.865085 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-operators-9t248" Mar 07 07:07:04 crc kubenswrapper[4738]: I0307 07:07:04.865750 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9t248" Mar 07 07:07:05 crc kubenswrapper[4738]: I0307 07:07:05.137519 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qchnm" Mar 07 07:07:05 crc kubenswrapper[4738]: I0307 07:07:05.137902 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qchnm" Mar 07 07:07:05 crc kubenswrapper[4738]: I0307 07:07:05.211968 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qchnm" Mar 07 07:07:05 crc kubenswrapper[4738]: I0307 07:07:05.361778 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qchnm" Mar 07 07:07:05 crc kubenswrapper[4738]: I0307 07:07:05.906720 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9t248" podUID="3aa7e15f-67b8-491e-b579-a52873583f7f" containerName="registry-server" probeResult="failure" output=< Mar 07 07:07:05 crc kubenswrapper[4738]: timeout: failed to connect service ":50051" within 1s Mar 07 07:07:05 crc kubenswrapper[4738]: > Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.261871 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f567m"] Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.262493 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.281448 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f567m"] Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.368000 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.368053 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c6696f6a-510d-400b-8668-ab677efebbf6-bound-sa-token\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.368077 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6696f6a-510d-400b-8668-ab677efebbf6-trusted-ca\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.368095 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c6696f6a-510d-400b-8668-ab677efebbf6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.368124 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c6696f6a-510d-400b-8668-ab677efebbf6-registry-certificates\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.368211 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6gt7\" (UniqueName: \"kubernetes.io/projected/c6696f6a-510d-400b-8668-ab677efebbf6-kube-api-access-j6gt7\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.368318 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c6696f6a-510d-400b-8668-ab677efebbf6-registry-tls\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.368345 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c6696f6a-510d-400b-8668-ab677efebbf6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.402820 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.473548 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c6696f6a-510d-400b-8668-ab677efebbf6-registry-tls\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.473663 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c6696f6a-510d-400b-8668-ab677efebbf6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.473710 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c6696f6a-510d-400b-8668-ab677efebbf6-bound-sa-token\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.473737 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6696f6a-510d-400b-8668-ab677efebbf6-trusted-ca\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: 
I0307 07:07:06.473759 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c6696f6a-510d-400b-8668-ab677efebbf6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.473803 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c6696f6a-510d-400b-8668-ab677efebbf6-registry-certificates\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.473844 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6gt7\" (UniqueName: \"kubernetes.io/projected/c6696f6a-510d-400b-8668-ab677efebbf6-kube-api-access-j6gt7\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.476532 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c6696f6a-510d-400b-8668-ab677efebbf6-ca-trust-extracted\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.476564 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c6696f6a-510d-400b-8668-ab677efebbf6-trusted-ca\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.477371 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c6696f6a-510d-400b-8668-ab677efebbf6-registry-certificates\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.485458 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c6696f6a-510d-400b-8668-ab677efebbf6-installation-pull-secrets\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.485768 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c6696f6a-510d-400b-8668-ab677efebbf6-registry-tls\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.492663 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c6696f6a-510d-400b-8668-ab677efebbf6-bound-sa-token\") pod \"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.494026 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6gt7\" (UniqueName: \"kubernetes.io/projected/c6696f6a-510d-400b-8668-ab677efebbf6-kube-api-access-j6gt7\") pod 
\"image-registry-66df7c8f76-f567m\" (UID: \"c6696f6a-510d-400b-8668-ab677efebbf6\") " pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.578589 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:06 crc kubenswrapper[4738]: I0307 07:07:06.991365 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-f567m"] Mar 07 07:07:07 crc kubenswrapper[4738]: W0307 07:07:07.005144 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6696f6a_510d_400b_8668_ab677efebbf6.slice/crio-5a683a4fae91bb83d4ee5bfbf64ef761f4b2f7fa34929a67416ac2a8bfce8803 WatchSource:0}: Error finding container 5a683a4fae91bb83d4ee5bfbf64ef761f4b2f7fa34929a67416ac2a8bfce8803: Status 404 returned error can't find the container with id 5a683a4fae91bb83d4ee5bfbf64ef761f4b2f7fa34929a67416ac2a8bfce8803 Mar 07 07:07:07 crc kubenswrapper[4738]: I0307 07:07:07.332970 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-f567m" event={"ID":"c6696f6a-510d-400b-8668-ab677efebbf6","Type":"ContainerStarted","Data":"e6e41ec926a0482b837c56602d4282540bd091ca1646ecae2a33b5b87f26f5ae"} Mar 07 07:07:07 crc kubenswrapper[4738]: I0307 07:07:07.333529 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:07 crc kubenswrapper[4738]: I0307 07:07:07.333583 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-f567m" event={"ID":"c6696f6a-510d-400b-8668-ab677efebbf6","Type":"ContainerStarted","Data":"5a683a4fae91bb83d4ee5bfbf64ef761f4b2f7fa34929a67416ac2a8bfce8803"} Mar 07 07:07:07 crc kubenswrapper[4738]: I0307 07:07:07.360878 
4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-f567m" podStartSLOduration=1.360847944 podStartE2EDuration="1.360847944s" podCreationTimestamp="2026-03-07 07:07:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:07:07.357888354 +0000 UTC m=+445.822875675" watchObservedRunningTime="2026-03-07 07:07:07.360847944 +0000 UTC m=+445.825835265" Mar 07 07:07:14 crc kubenswrapper[4738]: I0307 07:07:14.915093 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9t248" Mar 07 07:07:14 crc kubenswrapper[4738]: I0307 07:07:14.952506 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9t248" Mar 07 07:07:26 crc kubenswrapper[4738]: I0307 07:07:26.583450 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-f567m" Mar 07 07:07:26 crc kubenswrapper[4738]: I0307 07:07:26.664974 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gsfcg"] Mar 07 07:07:26 crc kubenswrapper[4738]: I0307 07:07:26.958448 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:07:26 crc kubenswrapper[4738]: I0307 07:07:26.958531 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Mar 07 07:07:30 crc kubenswrapper[4738]: I0307 07:07:30.715968 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:07:30 crc kubenswrapper[4738]: I0307 07:07:30.716607 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:07:30 crc kubenswrapper[4738]: I0307 07:07:30.717240 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:07:30 crc kubenswrapper[4738]: I0307 07:07:30.729985 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 07 07:07:30 crc kubenswrapper[4738]: I0307 07:07:30.896484 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 07 07:07:31 crc kubenswrapper[4738]: I0307 07:07:31.460674 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"2a9b4bed190826a1d6d97224a9e83dc8b592389a782f4e876c23ad1541fd5018"}
Mar 07 07:07:31 crc kubenswrapper[4738]: I0307 07:07:31.828836 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 07 07:07:31 crc kubenswrapper[4738]: I0307 07:07:31.828925 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 07 07:07:31 crc kubenswrapper[4738]: I0307 07:07:31.835182 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 07 07:07:31 crc kubenswrapper[4738]: I0307 07:07:31.835391 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 07 07:07:31 crc kubenswrapper[4738]: I0307 07:07:31.886081 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 07 07:07:32 crc kubenswrapper[4738]: I0307 07:07:32.097463 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 07 07:07:32 crc kubenswrapper[4738]: I0307 07:07:32.467135 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"a584a0a91cc6361de7c9e3f0b266a7473d866c27e4f23571974f3ec51ec9e45f"}
Mar 07 07:07:32 crc kubenswrapper[4738]: I0307 07:07:32.469202 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6b5b341702c38bdf91d5022277d08a57d7177307f839652588bec6cb1e402f8a"}
Mar 07 07:07:32 crc kubenswrapper[4738]: I0307 07:07:32.469224 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e4695677a16202d86478b08cf69bb542f45c2b0cb83eb7a3318b9bc15798e365"}
Mar 07 07:07:32 crc kubenswrapper[4738]: W0307 07:07:32.473619 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-d77cf5626e10e21502f59653105ccdda49410f1e7313694014e382dc65474f1a WatchSource:0}: Error finding container d77cf5626e10e21502f59653105ccdda49410f1e7313694014e382dc65474f1a: Status 404 returned error can't find the container with id d77cf5626e10e21502f59653105ccdda49410f1e7313694014e382dc65474f1a
Mar 07 07:07:33 crc kubenswrapper[4738]: I0307 07:07:33.475110 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a645cefb789f1f833b68454bb9f152f0650f1128334f42fb253698b6fc185c1d"}
Mar 07 07:07:33 crc kubenswrapper[4738]: I0307 07:07:33.475835 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d77cf5626e10e21502f59653105ccdda49410f1e7313694014e382dc65474f1a"}
Mar 07 07:07:33 crc kubenswrapper[4738]: I0307 07:07:33.476086 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 07 07:07:51 crc kubenswrapper[4738]: I0307 07:07:51.700756 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" podUID="4e5277c1-ff6d-49c3-9443-19d8f98cae68" containerName="registry" containerID="cri-o://1e3b1455af9bf2a71eccd710da195555868a28e65e06914bb847bf544bb5688e" gracePeriod=30
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.202280 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.297286 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") "
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.297343 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e5277c1-ff6d-49c3-9443-19d8f98cae68-registry-certificates\") pod \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") "
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.297395 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b28zr\" (UniqueName: \"kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-kube-api-access-b28zr\") pod \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") "
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.297431 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-bound-sa-token\") pod \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") "
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.297488 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e5277c1-ff6d-49c3-9443-19d8f98cae68-ca-trust-extracted\") pod \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") "
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.297507 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-registry-tls\") pod \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") "
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.297539 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e5277c1-ff6d-49c3-9443-19d8f98cae68-installation-pull-secrets\") pod \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") "
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.297558 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e5277c1-ff6d-49c3-9443-19d8f98cae68-trusted-ca\") pod \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\" (UID: \"4e5277c1-ff6d-49c3-9443-19d8f98cae68\") "
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.298557 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e5277c1-ff6d-49c3-9443-19d8f98cae68-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4e5277c1-ff6d-49c3-9443-19d8f98cae68" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.299211 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4e5277c1-ff6d-49c3-9443-19d8f98cae68-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4e5277c1-ff6d-49c3-9443-19d8f98cae68" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.304505 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4e5277c1-ff6d-49c3-9443-19d8f98cae68" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.304613 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4e5277c1-ff6d-49c3-9443-19d8f98cae68-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4e5277c1-ff6d-49c3-9443-19d8f98cae68" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.306535 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-kube-api-access-b28zr" (OuterVolumeSpecName: "kube-api-access-b28zr") pod "4e5277c1-ff6d-49c3-9443-19d8f98cae68" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68"). InnerVolumeSpecName "kube-api-access-b28zr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.306702 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4e5277c1-ff6d-49c3-9443-19d8f98cae68" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.311260 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4e5277c1-ff6d-49c3-9443-19d8f98cae68" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.321814 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e5277c1-ff6d-49c3-9443-19d8f98cae68-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4e5277c1-ff6d-49c3-9443-19d8f98cae68" (UID: "4e5277c1-ff6d-49c3-9443-19d8f98cae68"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.399604 4738 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4e5277c1-ff6d-49c3-9443-19d8f98cae68-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.399641 4738 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4e5277c1-ff6d-49c3-9443-19d8f98cae68-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.399652 4738 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4e5277c1-ff6d-49c3-9443-19d8f98cae68-registry-certificates\") on node \"crc\" DevicePath \"\""
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.399661 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b28zr\" (UniqueName: \"kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-kube-api-access-b28zr\") on node \"crc\" DevicePath \"\""
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.399669 4738 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.399678 4738 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4e5277c1-ff6d-49c3-9443-19d8f98cae68-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.399686 4738 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4e5277c1-ff6d-49c3-9443-19d8f98cae68-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.606841 4738 generic.go:334] "Generic (PLEG): container finished" podID="4e5277c1-ff6d-49c3-9443-19d8f98cae68" containerID="1e3b1455af9bf2a71eccd710da195555868a28e65e06914bb847bf544bb5688e" exitCode=0
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.606937 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" event={"ID":"4e5277c1-ff6d-49c3-9443-19d8f98cae68","Type":"ContainerDied","Data":"1e3b1455af9bf2a71eccd710da195555868a28e65e06914bb847bf544bb5688e"}
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.607042 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg" event={"ID":"4e5277c1-ff6d-49c3-9443-19d8f98cae68","Type":"ContainerDied","Data":"58abc2d40c524324709678519ec00b0553712c5bbc077046b857459dd9cafe07"}
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.607105 4738 scope.go:117] "RemoveContainer" containerID="1e3b1455af9bf2a71eccd710da195555868a28e65e06914bb847bf544bb5688e"
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.607782 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-gsfcg"
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.641465 4738 scope.go:117] "RemoveContainer" containerID="1e3b1455af9bf2a71eccd710da195555868a28e65e06914bb847bf544bb5688e"
Mar 07 07:07:52 crc kubenswrapper[4738]: E0307 07:07:52.642667 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e3b1455af9bf2a71eccd710da195555868a28e65e06914bb847bf544bb5688e\": container with ID starting with 1e3b1455af9bf2a71eccd710da195555868a28e65e06914bb847bf544bb5688e not found: ID does not exist" containerID="1e3b1455af9bf2a71eccd710da195555868a28e65e06914bb847bf544bb5688e"
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.642715 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e3b1455af9bf2a71eccd710da195555868a28e65e06914bb847bf544bb5688e"} err="failed to get container status \"1e3b1455af9bf2a71eccd710da195555868a28e65e06914bb847bf544bb5688e\": rpc error: code = NotFound desc = could not find container \"1e3b1455af9bf2a71eccd710da195555868a28e65e06914bb847bf544bb5688e\": container with ID starting with 1e3b1455af9bf2a71eccd710da195555868a28e65e06914bb847bf544bb5688e not found: ID does not exist"
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.652046 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gsfcg"]
Mar 07 07:07:52 crc kubenswrapper[4738]: I0307 07:07:52.662602 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-gsfcg"]
Mar 07 07:07:54 crc kubenswrapper[4738]: I0307 07:07:54.396078 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e5277c1-ff6d-49c3-9443-19d8f98cae68" path="/var/lib/kubelet/pods/4e5277c1-ff6d-49c3-9443-19d8f98cae68/volumes"
Mar 07 07:07:56 crc kubenswrapper[4738]: I0307 07:07:56.957961 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 07:07:56 crc kubenswrapper[4738]: I0307 07:07:56.958555 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 07:08:00 crc kubenswrapper[4738]: I0307 07:08:00.146014 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547788-s4dz4"]
Mar 07 07:08:00 crc kubenswrapper[4738]: E0307 07:08:00.146237 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5277c1-ff6d-49c3-9443-19d8f98cae68" containerName="registry"
Mar 07 07:08:00 crc kubenswrapper[4738]: I0307 07:08:00.146248 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5277c1-ff6d-49c3-9443-19d8f98cae68" containerName="registry"
Mar 07 07:08:00 crc kubenswrapper[4738]: I0307 07:08:00.146336 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e5277c1-ff6d-49c3-9443-19d8f98cae68" containerName="registry"
Mar 07 07:08:00 crc kubenswrapper[4738]: I0307 07:08:00.147337 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547788-s4dz4"
Mar 07 07:08:00 crc kubenswrapper[4738]: I0307 07:08:00.149387 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5"
Mar 07 07:08:00 crc kubenswrapper[4738]: I0307 07:08:00.150116 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 07 07:08:00 crc kubenswrapper[4738]: I0307 07:08:00.150150 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 07 07:08:00 crc kubenswrapper[4738]: I0307 07:08:00.161046 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547788-s4dz4"]
Mar 07 07:08:00 crc kubenswrapper[4738]: I0307 07:08:00.203618 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt9lz\" (UniqueName: \"kubernetes.io/projected/6eeba904-9ec3-46e1-803d-d3c4fc286879-kube-api-access-kt9lz\") pod \"auto-csr-approver-29547788-s4dz4\" (UID: \"6eeba904-9ec3-46e1-803d-d3c4fc286879\") " pod="openshift-infra/auto-csr-approver-29547788-s4dz4"
Mar 07 07:08:00 crc kubenswrapper[4738]: I0307 07:08:00.304741 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt9lz\" (UniqueName: \"kubernetes.io/projected/6eeba904-9ec3-46e1-803d-d3c4fc286879-kube-api-access-kt9lz\") pod \"auto-csr-approver-29547788-s4dz4\" (UID: \"6eeba904-9ec3-46e1-803d-d3c4fc286879\") " pod="openshift-infra/auto-csr-approver-29547788-s4dz4"
Mar 07 07:08:00 crc kubenswrapper[4738]: I0307 07:08:00.326937 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt9lz\" (UniqueName: \"kubernetes.io/projected/6eeba904-9ec3-46e1-803d-d3c4fc286879-kube-api-access-kt9lz\") pod \"auto-csr-approver-29547788-s4dz4\" (UID: \"6eeba904-9ec3-46e1-803d-d3c4fc286879\") " pod="openshift-infra/auto-csr-approver-29547788-s4dz4"
Mar 07 07:08:00 crc kubenswrapper[4738]: I0307 07:08:00.504793 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547788-s4dz4"
Mar 07 07:08:00 crc kubenswrapper[4738]: I0307 07:08:00.924803 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547788-s4dz4"]
Mar 07 07:08:01 crc kubenswrapper[4738]: I0307 07:08:01.667332 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547788-s4dz4" event={"ID":"6eeba904-9ec3-46e1-803d-d3c4fc286879","Type":"ContainerStarted","Data":"b8d5c5c4dee92fdeb11a72d33ca5d6155bbab3db8f9942caa720382a732ac10e"}
Mar 07 07:08:02 crc kubenswrapper[4738]: I0307 07:08:02.104557 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 07 07:08:02 crc kubenswrapper[4738]: I0307 07:08:02.680640 4738 generic.go:334] "Generic (PLEG): container finished" podID="6eeba904-9ec3-46e1-803d-d3c4fc286879" containerID="b1be626607c2a7a38c792c624e09efeecef7f88bb38aa1f7524c723c15aad2ec" exitCode=0
Mar 07 07:08:02 crc kubenswrapper[4738]: I0307 07:08:02.680720 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547788-s4dz4" event={"ID":"6eeba904-9ec3-46e1-803d-d3c4fc286879","Type":"ContainerDied","Data":"b1be626607c2a7a38c792c624e09efeecef7f88bb38aa1f7524c723c15aad2ec"}
Mar 07 07:08:03 crc kubenswrapper[4738]: I0307 07:08:03.981348 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547788-s4dz4"
Mar 07 07:08:04 crc kubenswrapper[4738]: I0307 07:08:04.053471 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt9lz\" (UniqueName: \"kubernetes.io/projected/6eeba904-9ec3-46e1-803d-d3c4fc286879-kube-api-access-kt9lz\") pod \"6eeba904-9ec3-46e1-803d-d3c4fc286879\" (UID: \"6eeba904-9ec3-46e1-803d-d3c4fc286879\") "
Mar 07 07:08:04 crc kubenswrapper[4738]: I0307 07:08:04.062432 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6eeba904-9ec3-46e1-803d-d3c4fc286879-kube-api-access-kt9lz" (OuterVolumeSpecName: "kube-api-access-kt9lz") pod "6eeba904-9ec3-46e1-803d-d3c4fc286879" (UID: "6eeba904-9ec3-46e1-803d-d3c4fc286879"). InnerVolumeSpecName "kube-api-access-kt9lz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:08:04 crc kubenswrapper[4738]: I0307 07:08:04.154651 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt9lz\" (UniqueName: \"kubernetes.io/projected/6eeba904-9ec3-46e1-803d-d3c4fc286879-kube-api-access-kt9lz\") on node \"crc\" DevicePath \"\""
Mar 07 07:08:04 crc kubenswrapper[4738]: I0307 07:08:04.700727 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547788-s4dz4" event={"ID":"6eeba904-9ec3-46e1-803d-d3c4fc286879","Type":"ContainerDied","Data":"b8d5c5c4dee92fdeb11a72d33ca5d6155bbab3db8f9942caa720382a732ac10e"}
Mar 07 07:08:04 crc kubenswrapper[4738]: I0307 07:08:04.700791 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8d5c5c4dee92fdeb11a72d33ca5d6155bbab3db8f9942caa720382a732ac10e"
Mar 07 07:08:04 crc kubenswrapper[4738]: I0307 07:08:04.700872 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547788-s4dz4"
Mar 07 07:08:05 crc kubenswrapper[4738]: I0307 07:08:05.039452 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547782-mrbc5"]
Mar 07 07:08:05 crc kubenswrapper[4738]: I0307 07:08:05.043854 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547782-mrbc5"]
Mar 07 07:08:06 crc kubenswrapper[4738]: I0307 07:08:06.395105 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09f978c5-7fd4-4852-95c4-915304c1bf18" path="/var/lib/kubelet/pods/09f978c5-7fd4-4852-95c4-915304c1bf18/volumes"
Mar 07 07:08:26 crc kubenswrapper[4738]: I0307 07:08:26.958292 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 07:08:26 crc kubenswrapper[4738]: I0307 07:08:26.959085 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 07:08:26 crc kubenswrapper[4738]: I0307 07:08:26.959201 4738 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc"
Mar 07 07:08:26 crc kubenswrapper[4738]: I0307 07:08:26.960004 4738 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f2215638c9459ede86496c7b956ec07b5b9ff65e16a9b539256ee476459590df"} pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 07 07:08:26 crc kubenswrapper[4738]: I0307 07:08:26.960090 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" containerID="cri-o://f2215638c9459ede86496c7b956ec07b5b9ff65e16a9b539256ee476459590df" gracePeriod=600
Mar 07 07:08:27 crc kubenswrapper[4738]: I0307 07:08:27.857471 4738 generic.go:334] "Generic (PLEG): container finished" podID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerID="f2215638c9459ede86496c7b956ec07b5b9ff65e16a9b539256ee476459590df" exitCode=0
Mar 07 07:08:27 crc kubenswrapper[4738]: I0307 07:08:27.857553 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerDied","Data":"f2215638c9459ede86496c7b956ec07b5b9ff65e16a9b539256ee476459590df"}
Mar 07 07:08:27 crc kubenswrapper[4738]: I0307 07:08:27.858490 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerStarted","Data":"f23a128c52dea9413300b9d6a04d09e3f706df7a2a254675e7571a8487859e17"}
Mar 07 07:08:27 crc kubenswrapper[4738]: I0307 07:08:27.858528 4738 scope.go:117] "RemoveContainer" containerID="6686bb75ec61d2366c90bdd12be9fac7b166c389fab7afae68a9fa902670abc3"
Mar 07 07:10:00 crc kubenswrapper[4738]: I0307 07:10:00.151454 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547790-lp9mz"]
Mar 07 07:10:00 crc kubenswrapper[4738]: E0307 07:10:00.152431 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6eeba904-9ec3-46e1-803d-d3c4fc286879" containerName="oc"
Mar 07 07:10:00 crc kubenswrapper[4738]: I0307 07:10:00.152451 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="6eeba904-9ec3-46e1-803d-d3c4fc286879" containerName="oc"
Mar 07 07:10:00 crc kubenswrapper[4738]: I0307 07:10:00.152639 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="6eeba904-9ec3-46e1-803d-d3c4fc286879" containerName="oc"
Mar 07 07:10:00 crc kubenswrapper[4738]: I0307 07:10:00.154018 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547790-lp9mz"
Mar 07 07:10:00 crc kubenswrapper[4738]: I0307 07:10:00.161540 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 07 07:10:00 crc kubenswrapper[4738]: I0307 07:10:00.161805 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5"
Mar 07 07:10:00 crc kubenswrapper[4738]: I0307 07:10:00.167872 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 07 07:10:00 crc kubenswrapper[4738]: I0307 07:10:00.169622 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547790-lp9mz"]
Mar 07 07:10:00 crc kubenswrapper[4738]: I0307 07:10:00.265875 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrjnv\" (UniqueName: \"kubernetes.io/projected/d93a749f-62f3-4014-9e01-dad1c1225fc1-kube-api-access-jrjnv\") pod \"auto-csr-approver-29547790-lp9mz\" (UID: \"d93a749f-62f3-4014-9e01-dad1c1225fc1\") " pod="openshift-infra/auto-csr-approver-29547790-lp9mz"
Mar 07 07:10:00 crc kubenswrapper[4738]: I0307 07:10:00.367256 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrjnv\" (UniqueName: \"kubernetes.io/projected/d93a749f-62f3-4014-9e01-dad1c1225fc1-kube-api-access-jrjnv\") pod \"auto-csr-approver-29547790-lp9mz\" (UID: \"d93a749f-62f3-4014-9e01-dad1c1225fc1\") " pod="openshift-infra/auto-csr-approver-29547790-lp9mz"
Mar 07 07:10:00 crc kubenswrapper[4738]: I0307 07:10:00.398950 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrjnv\" (UniqueName: \"kubernetes.io/projected/d93a749f-62f3-4014-9e01-dad1c1225fc1-kube-api-access-jrjnv\") pod \"auto-csr-approver-29547790-lp9mz\" (UID: \"d93a749f-62f3-4014-9e01-dad1c1225fc1\") " pod="openshift-infra/auto-csr-approver-29547790-lp9mz"
Mar 07 07:10:00 crc kubenswrapper[4738]: I0307 07:10:00.482569 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547790-lp9mz"
Mar 07 07:10:00 crc kubenswrapper[4738]: I0307 07:10:00.690607 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547790-lp9mz"]
Mar 07 07:10:00 crc kubenswrapper[4738]: I0307 07:10:00.699339 4738 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 07 07:10:01 crc kubenswrapper[4738]: I0307 07:10:01.497016 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547790-lp9mz" event={"ID":"d93a749f-62f3-4014-9e01-dad1c1225fc1","Type":"ContainerStarted","Data":"72e7e991cc2051315f4adeffdc426df41ae49115b58c6565754796ab50e86c97"}
Mar 07 07:10:02 crc kubenswrapper[4738]: I0307 07:10:02.504028 4738 generic.go:334] "Generic (PLEG): container finished" podID="d93a749f-62f3-4014-9e01-dad1c1225fc1" containerID="4fbdf0ecdd2715af02dc4218a4dfdda6a1324de3b79d4d9faead8fb2c675d9bf" exitCode=0
Mar 07 07:10:02 crc kubenswrapper[4738]: I0307 07:10:02.504133 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547790-lp9mz" event={"ID":"d93a749f-62f3-4014-9e01-dad1c1225fc1","Type":"ContainerDied","Data":"4fbdf0ecdd2715af02dc4218a4dfdda6a1324de3b79d4d9faead8fb2c675d9bf"}
Mar 07 07:10:03 crc kubenswrapper[4738]: I0307 07:10:03.747770 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547790-lp9mz"
Mar 07 07:10:03 crc kubenswrapper[4738]: I0307 07:10:03.913015 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrjnv\" (UniqueName: \"kubernetes.io/projected/d93a749f-62f3-4014-9e01-dad1c1225fc1-kube-api-access-jrjnv\") pod \"d93a749f-62f3-4014-9e01-dad1c1225fc1\" (UID: \"d93a749f-62f3-4014-9e01-dad1c1225fc1\") "
Mar 07 07:10:03 crc kubenswrapper[4738]: I0307 07:10:03.922285 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d93a749f-62f3-4014-9e01-dad1c1225fc1-kube-api-access-jrjnv" (OuterVolumeSpecName: "kube-api-access-jrjnv") pod "d93a749f-62f3-4014-9e01-dad1c1225fc1" (UID: "d93a749f-62f3-4014-9e01-dad1c1225fc1"). InnerVolumeSpecName "kube-api-access-jrjnv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:10:04 crc kubenswrapper[4738]: I0307 07:10:04.015921 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrjnv\" (UniqueName: \"kubernetes.io/projected/d93a749f-62f3-4014-9e01-dad1c1225fc1-kube-api-access-jrjnv\") on node \"crc\" DevicePath \"\""
Mar 07 07:10:04 crc kubenswrapper[4738]: I0307 07:10:04.524040 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547790-lp9mz" event={"ID":"d93a749f-62f3-4014-9e01-dad1c1225fc1","Type":"ContainerDied","Data":"72e7e991cc2051315f4adeffdc426df41ae49115b58c6565754796ab50e86c97"}
Mar 07 07:10:04 crc kubenswrapper[4738]: I0307 07:10:04.524103 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72e7e991cc2051315f4adeffdc426df41ae49115b58c6565754796ab50e86c97"
Mar 07 07:10:04 crc kubenswrapper[4738]: I0307 07:10:04.524227 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547790-lp9mz"
Mar 07 07:10:04 crc kubenswrapper[4738]: I0307 07:10:04.808208 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547784-vqg4n"]
Mar 07 07:10:04 crc kubenswrapper[4738]: I0307 07:10:04.812053 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547784-vqg4n"]
Mar 07 07:10:06 crc kubenswrapper[4738]: I0307 07:10:06.398026 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a41584d7-5e09-4e4b-9b1d-6aa4b7d13039" path="/var/lib/kubelet/pods/a41584d7-5e09-4e4b-9b1d-6aa4b7d13039/volumes"
Mar 07 07:10:43 crc kubenswrapper[4738]: I0307 07:10:43.510414 4738 scope.go:117] "RemoveContainer" containerID="fca17f0eff9dfb92737a52be4b29ea4279eb260fc2eec54cab232cb730e635af"
Mar 07 07:10:56 crc kubenswrapper[4738]: I0307 07:10:56.958470 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 07:10:56 crc kubenswrapper[4738]: I0307 07:10:56.959125 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 07:11:26 crc kubenswrapper[4738]: I0307 07:11:26.957546 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 07:11:26 crc kubenswrapper[4738]: I0307 07:11:26.958360 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 07:11:43 crc kubenswrapper[4738]: I0307 07:11:43.569227 4738 scope.go:117] "RemoveContainer" containerID="872d29a6f1ca6bde82c01c7da420f4a8dcad68c5b40f2d47a1be8a69e4b721e6"
Mar 07 07:11:56 crc kubenswrapper[4738]: I0307 07:11:56.958143 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 07:11:56 crc kubenswrapper[4738]: I0307 07:11:56.958999 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 07:11:56 crc kubenswrapper[4738]: I0307 07:11:56.959063 4738 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc"
Mar 07 07:11:56 crc kubenswrapper[4738]: I0307 07:11:56.959944 4738 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f23a128c52dea9413300b9d6a04d09e3f706df7a2a254675e7571a8487859e17"} pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 07 07:11:56 crc kubenswrapper[4738]: I0307 07:11:56.960042 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" containerID="cri-o://f23a128c52dea9413300b9d6a04d09e3f706df7a2a254675e7571a8487859e17" gracePeriod=600
Mar 07 07:11:57 crc kubenswrapper[4738]: I0307 07:11:57.343313 4738 generic.go:334] "Generic (PLEG): container finished" podID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerID="f23a128c52dea9413300b9d6a04d09e3f706df7a2a254675e7571a8487859e17" exitCode=0
Mar 07 07:11:57 crc kubenswrapper[4738]: I0307 07:11:57.343670 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerDied","Data":"f23a128c52dea9413300b9d6a04d09e3f706df7a2a254675e7571a8487859e17"}
Mar 07 07:11:57 crc kubenswrapper[4738]: I0307 07:11:57.343700 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerStarted","Data":"243b9193ac6c5a75ee43ac39995d1462b4eb26a64963d0af07b51ef1dcd6c0f7"}
Mar 07 07:11:57 crc kubenswrapper[4738]: I0307 07:11:57.343716 4738 scope.go:117] "RemoveContainer" containerID="f2215638c9459ede86496c7b956ec07b5b9ff65e16a9b539256ee476459590df"
Mar 07 07:12:00 crc kubenswrapper[4738]: I0307 07:12:00.139345 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547792-2dlw8"]
Mar 07 07:12:00 crc kubenswrapper[4738]: E0307 07:12:00.140917 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d93a749f-62f3-4014-9e01-dad1c1225fc1" containerName="oc"
Mar 07 07:12:00 crc kubenswrapper[4738]: I0307 07:12:00.141037 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="d93a749f-62f3-4014-9e01-dad1c1225fc1" containerName="oc"
Mar 07 07:12:00 crc
kubenswrapper[4738]: I0307 07:12:00.141266 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="d93a749f-62f3-4014-9e01-dad1c1225fc1" containerName="oc" Mar 07 07:12:00 crc kubenswrapper[4738]: I0307 07:12:00.141795 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547792-2dlw8" Mar 07 07:12:00 crc kubenswrapper[4738]: I0307 07:12:00.143872 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:12:00 crc kubenswrapper[4738]: I0307 07:12:00.144130 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:12:00 crc kubenswrapper[4738]: I0307 07:12:00.144998 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:12:00 crc kubenswrapper[4738]: I0307 07:12:00.147268 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547792-2dlw8"] Mar 07 07:12:00 crc kubenswrapper[4738]: I0307 07:12:00.158298 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8q4f\" (UniqueName: \"kubernetes.io/projected/5b20b3db-801f-4ee6-8026-c6f32666a798-kube-api-access-h8q4f\") pod \"auto-csr-approver-29547792-2dlw8\" (UID: \"5b20b3db-801f-4ee6-8026-c6f32666a798\") " pod="openshift-infra/auto-csr-approver-29547792-2dlw8" Mar 07 07:12:00 crc kubenswrapper[4738]: I0307 07:12:00.259772 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8q4f\" (UniqueName: \"kubernetes.io/projected/5b20b3db-801f-4ee6-8026-c6f32666a798-kube-api-access-h8q4f\") pod \"auto-csr-approver-29547792-2dlw8\" (UID: \"5b20b3db-801f-4ee6-8026-c6f32666a798\") " pod="openshift-infra/auto-csr-approver-29547792-2dlw8" Mar 07 07:12:00 crc kubenswrapper[4738]: I0307 07:12:00.281514 4738 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8q4f\" (UniqueName: \"kubernetes.io/projected/5b20b3db-801f-4ee6-8026-c6f32666a798-kube-api-access-h8q4f\") pod \"auto-csr-approver-29547792-2dlw8\" (UID: \"5b20b3db-801f-4ee6-8026-c6f32666a798\") " pod="openshift-infra/auto-csr-approver-29547792-2dlw8" Mar 07 07:12:00 crc kubenswrapper[4738]: I0307 07:12:00.472757 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547792-2dlw8" Mar 07 07:12:00 crc kubenswrapper[4738]: I0307 07:12:00.696365 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547792-2dlw8"] Mar 07 07:12:00 crc kubenswrapper[4738]: W0307 07:12:00.704784 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b20b3db_801f_4ee6_8026_c6f32666a798.slice/crio-5251bcce50954d4ac1adb414dc05e27547d9b06c8681db87e55ae0f65ae4ef7c WatchSource:0}: Error finding container 5251bcce50954d4ac1adb414dc05e27547d9b06c8681db87e55ae0f65ae4ef7c: Status 404 returned error can't find the container with id 5251bcce50954d4ac1adb414dc05e27547d9b06c8681db87e55ae0f65ae4ef7c Mar 07 07:12:01 crc kubenswrapper[4738]: I0307 07:12:01.378291 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547792-2dlw8" event={"ID":"5b20b3db-801f-4ee6-8026-c6f32666a798","Type":"ContainerStarted","Data":"5251bcce50954d4ac1adb414dc05e27547d9b06c8681db87e55ae0f65ae4ef7c"} Mar 07 07:12:02 crc kubenswrapper[4738]: I0307 07:12:02.385876 4738 generic.go:334] "Generic (PLEG): container finished" podID="5b20b3db-801f-4ee6-8026-c6f32666a798" containerID="f4f63b30ce670e3b9dbd6410ebe2380cbec74d33da37f92238589e03f79eb043" exitCode=0 Mar 07 07:12:02 crc kubenswrapper[4738]: I0307 07:12:02.395501 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29547792-2dlw8" event={"ID":"5b20b3db-801f-4ee6-8026-c6f32666a798","Type":"ContainerDied","Data":"f4f63b30ce670e3b9dbd6410ebe2380cbec74d33da37f92238589e03f79eb043"} Mar 07 07:12:03 crc kubenswrapper[4738]: I0307 07:12:03.621891 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547792-2dlw8" Mar 07 07:12:03 crc kubenswrapper[4738]: I0307 07:12:03.811296 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8q4f\" (UniqueName: \"kubernetes.io/projected/5b20b3db-801f-4ee6-8026-c6f32666a798-kube-api-access-h8q4f\") pod \"5b20b3db-801f-4ee6-8026-c6f32666a798\" (UID: \"5b20b3db-801f-4ee6-8026-c6f32666a798\") " Mar 07 07:12:03 crc kubenswrapper[4738]: I0307 07:12:03.825692 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b20b3db-801f-4ee6-8026-c6f32666a798-kube-api-access-h8q4f" (OuterVolumeSpecName: "kube-api-access-h8q4f") pod "5b20b3db-801f-4ee6-8026-c6f32666a798" (UID: "5b20b3db-801f-4ee6-8026-c6f32666a798"). InnerVolumeSpecName "kube-api-access-h8q4f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:03 crc kubenswrapper[4738]: I0307 07:12:03.913122 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8q4f\" (UniqueName: \"kubernetes.io/projected/5b20b3db-801f-4ee6-8026-c6f32666a798-kube-api-access-h8q4f\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:04 crc kubenswrapper[4738]: I0307 07:12:04.400529 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547792-2dlw8" event={"ID":"5b20b3db-801f-4ee6-8026-c6f32666a798","Type":"ContainerDied","Data":"5251bcce50954d4ac1adb414dc05e27547d9b06c8681db87e55ae0f65ae4ef7c"} Mar 07 07:12:04 crc kubenswrapper[4738]: I0307 07:12:04.400581 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5251bcce50954d4ac1adb414dc05e27547d9b06c8681db87e55ae0f65ae4ef7c" Mar 07 07:12:04 crc kubenswrapper[4738]: I0307 07:12:04.400647 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547792-2dlw8" Mar 07 07:12:04 crc kubenswrapper[4738]: I0307 07:12:04.695786 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547786-66p2d"] Mar 07 07:12:04 crc kubenswrapper[4738]: I0307 07:12:04.701929 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547786-66p2d"] Mar 07 07:12:06 crc kubenswrapper[4738]: I0307 07:12:06.397646 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4" path="/var/lib/kubelet/pods/b58b35ca-ec6c-4d04-bcd2-22b5d66cf9c4/volumes" Mar 07 07:12:14 crc kubenswrapper[4738]: I0307 07:12:14.719543 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jh7s7"] Mar 07 07:12:14 crc kubenswrapper[4738]: I0307 07:12:14.721421 4738 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovn-controller" containerID="cri-o://0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d" gracePeriod=30 Mar 07 07:12:14 crc kubenswrapper[4738]: I0307 07:12:14.721529 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="northd" containerID="cri-o://50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41" gracePeriod=30 Mar 07 07:12:14 crc kubenswrapper[4738]: I0307 07:12:14.721602 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044" gracePeriod=30 Mar 07 07:12:14 crc kubenswrapper[4738]: I0307 07:12:14.721630 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovn-acl-logging" containerID="cri-o://da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609" gracePeriod=30 Mar 07 07:12:14 crc kubenswrapper[4738]: I0307 07:12:14.721695 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="sbdb" containerID="cri-o://2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8" gracePeriod=30 Mar 07 07:12:14 crc kubenswrapper[4738]: I0307 07:12:14.721589 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="kube-rbac-proxy-node" 
containerID="cri-o://4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4" gracePeriod=30 Mar 07 07:12:14 crc kubenswrapper[4738]: I0307 07:12:14.721756 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="nbdb" containerID="cri-o://d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153" gracePeriod=30 Mar 07 07:12:14 crc kubenswrapper[4738]: I0307 07:12:14.799412 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" containerID="cri-o://0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51" gracePeriod=30 Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.092514 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovnkube-controller/3.log" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.100530 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovn-acl-logging/0.log" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.101251 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovn-controller/0.log" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.101924 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.152758 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-zdrck"] Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.152989 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.153009 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.153020 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovn-acl-logging" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.154758 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovn-acl-logging" Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.154776 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="kubecfg-setup" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.154783 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="kubecfg-setup" Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.154796 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.154805 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.154821 4738 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.154829 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.154838 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="sbdb" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.154845 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="sbdb" Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.154857 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b20b3db-801f-4ee6-8026-c6f32666a798" containerName="oc" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.154864 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b20b3db-801f-4ee6-8026-c6f32666a798" containerName="oc" Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.154875 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovn-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.154883 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovn-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.154893 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="kube-rbac-proxy-node" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.154900 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="kube-rbac-proxy-node" Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.154907 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" 
containerName="northd" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.154914 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="northd" Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.155566 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="kube-rbac-proxy-ovn-metrics" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.155582 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="kube-rbac-proxy-ovn-metrics" Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.155592 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="nbdb" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.155599 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="nbdb" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.155765 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="northd" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.155780 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="sbdb" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.155790 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.155799 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="kube-rbac-proxy-ovn-metrics" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.155811 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" 
containerName="ovn-acl-logging" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.155823 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.155832 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovn-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.155841 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.155849 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="kube-rbac-proxy-node" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.155860 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b20b3db-801f-4ee6-8026-c6f32666a798" containerName="oc" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.155872 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="nbdb" Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.155988 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.155999 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.156111 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.156254 4738 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.156267 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.156403 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerName="ovnkube-controller" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.160264 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.273602 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-run-netns\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.273675 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-openvswitch\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.273706 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.273737 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-run-ovn-kubernetes\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.273765 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v46jv\" (UniqueName: \"kubernetes.io/projected/0e3f9734-9fb5-4b90-9268-888bc377406e-kube-api-access-v46jv\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.273780 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.273796 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-systemd\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.273825 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-node-log\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.273854 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-etc-openvswitch\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.273879 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-ovnkube-script-lib\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.273907 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-cni-bin\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.273936 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/0e3f9734-9fb5-4b90-9268-888bc377406e-ovn-node-metrics-cert\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.273935 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-node-log" (OuterVolumeSpecName: "node-log") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.273966 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-systemd-units\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274007 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-slash\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274032 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-log-socket\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274059 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-env-overrides\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " 
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274084 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-ovn\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274105 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-kubelet\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274138 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-var-lib-openvswitch\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274179 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-cni-netd\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274203 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274243 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-ovnkube-config\") pod \"0e3f9734-9fb5-4b90-9268-888bc377406e\" (UID: \"0e3f9734-9fb5-4b90-9268-888bc377406e\") " Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274270 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274365 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274402 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-slash" (OuterVolumeSpecName: "host-slash") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274401 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274441 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274453 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274449 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274512 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274484 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274495 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-log-socket" (OuterVolumeSpecName: "log-socket") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274648 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274705 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-ovn-node-metrics-cert\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274740 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-var-lib-openvswitch\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274764 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-systemd-units\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274823 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-node-log\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274837 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: 
"0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274847 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-cni-netd\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274948 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-run-ovn-kubernetes\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.274975 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-cni-bin\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275059 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275193 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-etc-openvswitch\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275262 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-log-socket\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275293 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz7dg\" (UniqueName: \"kubernetes.io/projected/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-kube-api-access-tz7dg\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275320 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275323 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod 
"0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275349 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-slash\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275507 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-kubelet\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275619 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-ovnkube-config\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275657 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-run-systemd\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275689 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-run-openvswitch\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275725 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-run-ovn\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275761 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-run-netns\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275794 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-ovnkube-script-lib\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275832 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-env-overrides\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275942 4738 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275967 4738 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.275988 4738 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.276006 4738 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.276026 4738 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.276045 4738 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-node-log\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.276062 4738 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.276079 4738 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.276097 4738 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.276118 4738 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.276135 4738 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-slash\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.276151 4738 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-log-socket\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.276203 4738 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0e3f9734-9fb5-4b90-9268-888bc377406e-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.276220 4738 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.276236 4738 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 
07:12:15.276253 4738 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.276290 4738 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.282218 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3f9734-9fb5-4b90-9268-888bc377406e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.282329 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e3f9734-9fb5-4b90-9268-888bc377406e-kube-api-access-v46jv" (OuterVolumeSpecName: "kube-api-access-v46jv") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "kube-api-access-v46jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.290529 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "0e3f9734-9fb5-4b90-9268-888bc377406e" (UID: "0e3f9734-9fb5-4b90-9268-888bc377406e"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.377781 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-ovnkube-script-lib\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.377836 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-env-overrides\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.377862 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-ovn-node-metrics-cert\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.377900 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-var-lib-openvswitch\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.377925 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-systemd-units\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.377972 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-node-log\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.377998 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-cni-netd\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378031 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-run-ovn-kubernetes\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378053 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-cni-bin\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378076 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-etc-openvswitch\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc 
kubenswrapper[4738]: I0307 07:12:15.378109 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-log-socket\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378137 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz7dg\" (UniqueName: \"kubernetes.io/projected/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-kube-api-access-tz7dg\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378189 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378212 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-slash\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378235 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-kubelet\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 
07:12:15.378261 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-ovnkube-config\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378285 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-run-systemd\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378304 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-run-openvswitch\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378326 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-run-ovn\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378346 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-run-netns\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378375 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-cni-bin\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378393 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v46jv\" (UniqueName: \"kubernetes.io/projected/0e3f9734-9fb5-4b90-9268-888bc377406e-kube-api-access-v46jv\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378782 4738 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0e3f9734-9fb5-4b90-9268-888bc377406e-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378435 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-run-netns\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378808 4738 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0e3f9734-9fb5-4b90-9268-888bc377406e-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378459 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-etc-openvswitch\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378482 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-log-socket\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378491 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-systemd-units\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378513 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-node-log\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378561 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-var-lib-openvswitch\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378617 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-run-ovn-kubernetes\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378635 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378674 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-run-systemd\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378672 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-kubelet\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378475 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-cni-netd\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378797 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-run-openvswitch\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.378554 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-host-slash\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.379397 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-env-overrides\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.379586 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-run-ovn\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.380032 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-ovnkube-config\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.380672 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-ovnkube-script-lib\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.385572 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-ovn-node-metrics-cert\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.406244 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz7dg\" (UniqueName: \"kubernetes.io/projected/e3cda3db-6079-4bb8-850b-ff3b85feb9cc-kube-api-access-tz7dg\") pod \"ovnkube-node-zdrck\" (UID: \"e3cda3db-6079-4bb8-850b-ff3b85feb9cc\") " pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.490573 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovnkube-controller/3.log"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.492641 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.493662 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovn-acl-logging/0.log"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494234 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jh7s7_0e3f9734-9fb5-4b90-9268-888bc377406e/ovn-controller/0.log"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494637 4738 generic.go:334] "Generic (PLEG): container finished" podID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerID="0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51" exitCode=0
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494676 4738 generic.go:334] "Generic (PLEG): container finished" podID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerID="2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8" exitCode=0
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494689 4738 generic.go:334] "Generic (PLEG): container finished" podID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerID="d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153" exitCode=0
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494698 4738 generic.go:334] "Generic (PLEG): container finished" podID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerID="50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41" exitCode=0
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494708 4738 generic.go:334] "Generic (PLEG): container finished" podID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerID="e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044" exitCode=0
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494718 4738 generic.go:334] "Generic (PLEG): container finished" podID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerID="4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4" exitCode=0
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494728 4738 generic.go:334] "Generic (PLEG): container finished" podID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerID="da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609" exitCode=143
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494737 4738 generic.go:334] "Generic (PLEG): container finished" podID="0e3f9734-9fb5-4b90-9268-888bc377406e" containerID="0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d" exitCode=143
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494747 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494799 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerDied","Data":"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494832 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerDied","Data":"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494846 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerDied","Data":"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494856 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerDied","Data":"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494867 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerDied","Data":"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494879 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerDied","Data":"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494890 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494900 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494906 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494912 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494917 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494923 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494929 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494934 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494940 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494946 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerDied","Data":"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494953 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494959 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494964 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494970 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494975 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494981 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494987 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494993 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.494999 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495006 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495015 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerDied","Data":"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495027 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495036 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495044 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495052 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495059 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495067 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495073 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495078 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495083 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495089 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495097 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jh7s7" event={"ID":"0e3f9734-9fb5-4b90-9268-888bc377406e","Type":"ContainerDied","Data":"07e7e0f5700de1ac15b11facbe5e897760face580efdb4957e9a0b59babde84d"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495107 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495115 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495124 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495130 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495135 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495140 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495145 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495171 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495185 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495192 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.495209 4738 scope.go:117] "RemoveContainer" containerID="0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.497570 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-54cnw_c0a91659-d53f-4694-82a7-8c66445ab4f5/kube-multus/2.log"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.498172 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-54cnw_c0a91659-d53f-4694-82a7-8c66445ab4f5/kube-multus/1.log"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.498202 4738 generic.go:334] "Generic (PLEG): container finished" podID="c0a91659-d53f-4694-82a7-8c66445ab4f5" containerID="3ee44a78672d8a355b0dbe707d77bfb8475b2f769e671f6359e5720aaae3c345" exitCode=2
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.498224 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-54cnw" event={"ID":"c0a91659-d53f-4694-82a7-8c66445ab4f5","Type":"ContainerDied","Data":"3ee44a78672d8a355b0dbe707d77bfb8475b2f769e671f6359e5720aaae3c345"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.498240 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"13d0b1b16f53e9e23fe6bbed5082f4f749c4f91f743d1b8400e0a81e8b1944c5"}
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.498890 4738 scope.go:117] "RemoveContainer" containerID="3ee44a78672d8a355b0dbe707d77bfb8475b2f769e671f6359e5720aaae3c345"
Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.499086 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-54cnw_openshift-multus(c0a91659-d53f-4694-82a7-8c66445ab4f5)\"" pod="openshift-multus/multus-54cnw" podUID="c0a91659-d53f-4694-82a7-8c66445ab4f5"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.523023 4738 scope.go:117] "RemoveContainer" containerID="168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.548533 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jh7s7"]
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.554499 4738 scope.go:117] "RemoveContainer" containerID="2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.556818 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jh7s7"]
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.580480 4738 scope.go:117] "RemoveContainer" containerID="d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.601122 4738 scope.go:117] "RemoveContainer" containerID="50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.628961 4738 scope.go:117] "RemoveContainer" containerID="e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.656299 4738 scope.go:117] "RemoveContainer" containerID="4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.675086 4738 scope.go:117] "RemoveContainer" containerID="da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.702131 4738 scope.go:117] "RemoveContainer" containerID="0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.773128 4738 scope.go:117] "RemoveContainer" containerID="0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.793723 4738 scope.go:117] "RemoveContainer" containerID="0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51"
Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.794321 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51\": container with ID starting with 0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51 not found: ID does not exist" containerID="0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.794367 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51"} err="failed to get container status \"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51\": rpc error: code = NotFound desc = could not find container \"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51\": container with ID starting with 0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51 not found: ID does not exist"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.794482 4738 scope.go:117] "RemoveContainer" containerID="168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e"
Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.794989 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\": container with ID starting with 168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e not found: ID does not exist" containerID="168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.795028 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e"} err="failed to get container status \"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\": rpc error: code = NotFound desc = could not find container \"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\": container with ID starting with 168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e not found: ID does not exist"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.795047 4738 scope.go:117] "RemoveContainer" containerID="2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8"
Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.795787 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\": container with ID starting with 2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8 not found: ID does not exist" containerID="2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.795832 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8"} err="failed to get container status \"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\": rpc error: code = NotFound desc = could not find container \"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\": container with ID starting with 2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8 not found: ID does not exist"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.795860 4738 scope.go:117] "RemoveContainer" containerID="d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153"
Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.796381 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\": container with ID starting with d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153 not found: ID does not exist" containerID="d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.796481 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153"} err="failed to get container status \"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\": rpc error: code = NotFound desc = could not find container \"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\": container with ID starting with d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153 not found: ID does not exist"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.796526 4738 scope.go:117] "RemoveContainer" containerID="50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41"
Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.797094 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\": container with ID starting with 50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41 not found: ID does not exist" containerID="50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.797195 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41"} err="failed to get container status \"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\": rpc error: code = NotFound desc = could not find container \"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\": container with ID starting with 50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41 not found: ID does not exist"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.797243 4738 scope.go:117] "RemoveContainer" containerID="e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044"
Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.797913 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\": container with ID starting with e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044 not found: ID does not exist" containerID="e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.797955 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044"} err="failed to get container status \"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\": rpc error: code = NotFound desc = could not find container \"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\": container with ID starting with e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044 not found: ID does not exist"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.797978 4738 scope.go:117] "RemoveContainer" containerID="4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4"
Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.798535 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\": container with ID starting with 4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4 not found: ID does not exist" containerID="4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.798572 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4"} err="failed to get container status \"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\": rpc error: code = NotFound desc = could not find container \"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\": container with ID starting with 4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4 not found: ID does not exist"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.798621 4738 scope.go:117] "RemoveContainer" containerID="da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609"
Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.799007 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\": container with ID starting with da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609 not found: ID does not exist" containerID="da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.799040 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609"} err="failed to get container status \"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\": rpc error: code = NotFound desc = could not find container \"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\": container with ID starting with da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609 not found: ID does not exist"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.799058 4738 scope.go:117] "RemoveContainer" containerID="0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d"
Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.799449 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\": container with ID starting with 0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d not found: ID does not exist" containerID="0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.799489 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d"} err="failed to get container status \"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\": rpc error: code = NotFound desc = could not find container \"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\": container with ID starting with 0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d not found: ID does not exist"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.799511 4738 scope.go:117] "RemoveContainer" containerID="0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb"
Mar 07 07:12:15 crc kubenswrapper[4738]: E0307 07:12:15.799930 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\": container with ID starting with 0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb not found: ID does not exist" containerID="0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.799963 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb"} err="failed to get container status \"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\": rpc error: code = NotFound desc = could not find container \"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\": container with ID starting with 0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb not found: ID does not exist"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.799982 4738 scope.go:117] "RemoveContainer" containerID="0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.800290 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51"} err="failed to get container status \"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51\": rpc error: code = NotFound desc = could not find container \"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51\": container with ID starting with 0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51 not found: ID does not exist"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.800316 4738 scope.go:117] "RemoveContainer" containerID="168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.800616 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e"} err="failed to get container status \"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\": rpc error: code = NotFound desc = could not find container \"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\": container with ID starting with 168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e not found: ID does not exist"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.800637 4738 scope.go:117] "RemoveContainer" containerID="2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.800945 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8"} err="failed to get container status \"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\": rpc error: code = NotFound desc = could not find container \"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\": container with ID starting with 2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8 not found: ID does not exist"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.800967 4738 scope.go:117] "RemoveContainer" containerID="d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153"
Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.801282 4738 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153"} err="failed to get container status \"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\": rpc error: code = NotFound desc = could not find container \"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\": container with ID starting with d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.801301 4738 scope.go:117] "RemoveContainer" containerID="50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.801588 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41"} err="failed to get container status \"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\": rpc error: code = NotFound desc = could not find container \"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\": container with ID starting with 50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.801607 4738 scope.go:117] "RemoveContainer" containerID="e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.801949 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044"} err="failed to get container status \"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\": rpc error: code = NotFound desc = could not find container \"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\": container with ID starting with e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044 not found: ID does not 
exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.801972 4738 scope.go:117] "RemoveContainer" containerID="4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.802308 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4"} err="failed to get container status \"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\": rpc error: code = NotFound desc = could not find container \"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\": container with ID starting with 4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.802357 4738 scope.go:117] "RemoveContainer" containerID="da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.802645 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609"} err="failed to get container status \"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\": rpc error: code = NotFound desc = could not find container \"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\": container with ID starting with da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.802677 4738 scope.go:117] "RemoveContainer" containerID="0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.803050 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d"} err="failed to get container status 
\"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\": rpc error: code = NotFound desc = could not find container \"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\": container with ID starting with 0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.803068 4738 scope.go:117] "RemoveContainer" containerID="0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.803366 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb"} err="failed to get container status \"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\": rpc error: code = NotFound desc = could not find container \"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\": container with ID starting with 0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.803390 4738 scope.go:117] "RemoveContainer" containerID="0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.803790 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51"} err="failed to get container status \"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51\": rpc error: code = NotFound desc = could not find container \"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51\": container with ID starting with 0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.803832 4738 scope.go:117] "RemoveContainer" 
containerID="168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.804226 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e"} err="failed to get container status \"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\": rpc error: code = NotFound desc = could not find container \"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\": container with ID starting with 168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.804246 4738 scope.go:117] "RemoveContainer" containerID="2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.804691 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8"} err="failed to get container status \"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\": rpc error: code = NotFound desc = could not find container \"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\": container with ID starting with 2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.804718 4738 scope.go:117] "RemoveContainer" containerID="d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.805028 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153"} err="failed to get container status \"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\": rpc error: code = NotFound desc = could 
not find container \"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\": container with ID starting with d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.805049 4738 scope.go:117] "RemoveContainer" containerID="50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.805425 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41"} err="failed to get container status \"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\": rpc error: code = NotFound desc = could not find container \"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\": container with ID starting with 50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.805471 4738 scope.go:117] "RemoveContainer" containerID="e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.805770 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044"} err="failed to get container status \"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\": rpc error: code = NotFound desc = could not find container \"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\": container with ID starting with e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.805793 4738 scope.go:117] "RemoveContainer" containerID="4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 
07:12:15.806056 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4"} err="failed to get container status \"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\": rpc error: code = NotFound desc = could not find container \"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\": container with ID starting with 4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.806073 4738 scope.go:117] "RemoveContainer" containerID="da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.806432 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609"} err="failed to get container status \"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\": rpc error: code = NotFound desc = could not find container \"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\": container with ID starting with da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.806465 4738 scope.go:117] "RemoveContainer" containerID="0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.806852 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d"} err="failed to get container status \"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\": rpc error: code = NotFound desc = could not find container \"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\": container with ID starting with 
0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.806883 4738 scope.go:117] "RemoveContainer" containerID="0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.807205 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb"} err="failed to get container status \"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\": rpc error: code = NotFound desc = could not find container \"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\": container with ID starting with 0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.807228 4738 scope.go:117] "RemoveContainer" containerID="0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.807574 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51"} err="failed to get container status \"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51\": rpc error: code = NotFound desc = could not find container \"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51\": container with ID starting with 0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.807607 4738 scope.go:117] "RemoveContainer" containerID="168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.808058 4738 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e"} err="failed to get container status \"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\": rpc error: code = NotFound desc = could not find container \"168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e\": container with ID starting with 168a37fa803072f5d4cb38c5a5edb6ccae6eda9dc9b7a8eb87bd80bbaf4b132e not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.808078 4738 scope.go:117] "RemoveContainer" containerID="2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.808433 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8"} err="failed to get container status \"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\": rpc error: code = NotFound desc = could not find container \"2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8\": container with ID starting with 2e02da3eb1ca0e8acd81097efddd4ddb785b3e329c68d21372e9e6634be78dc8 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.808464 4738 scope.go:117] "RemoveContainer" containerID="d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.808804 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153"} err="failed to get container status \"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\": rpc error: code = NotFound desc = could not find container \"d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153\": container with ID starting with d84b69b70e9b68039b8147d6efd9e995171255cd3aff98c495e6699e35a00153 not found: ID does not 
exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.808825 4738 scope.go:117] "RemoveContainer" containerID="50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.809212 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41"} err="failed to get container status \"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\": rpc error: code = NotFound desc = could not find container \"50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41\": container with ID starting with 50f7a631124612edba70af0b1cf4d790ae34b4b001a87fac4a9c1ba8a6486a41 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.809236 4738 scope.go:117] "RemoveContainer" containerID="e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.809518 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044"} err="failed to get container status \"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\": rpc error: code = NotFound desc = could not find container \"e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044\": container with ID starting with e48148dcd1d9201cb56bdf0420670b75f1d5cabf9d8ea11445cc9d705342d044 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.809538 4738 scope.go:117] "RemoveContainer" containerID="4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.809912 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4"} err="failed to get container status 
\"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\": rpc error: code = NotFound desc = could not find container \"4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4\": container with ID starting with 4199e5ce75b732673d5bde9465dd1849136066ffa8460cd394ea5259537518b4 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.809944 4738 scope.go:117] "RemoveContainer" containerID="da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.810307 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609"} err="failed to get container status \"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\": rpc error: code = NotFound desc = could not find container \"da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609\": container with ID starting with da7fd087567f853f71360201ec8442eeed0913b5f0c9e361ecbd460238223609 not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.810329 4738 scope.go:117] "RemoveContainer" containerID="0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.810690 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d"} err="failed to get container status \"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\": rpc error: code = NotFound desc = could not find container \"0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d\": container with ID starting with 0e90a105d33c9f409fbe9e9dd9feccf58b45dcafb267af55aa504a830b493e6d not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.810707 4738 scope.go:117] "RemoveContainer" 
containerID="0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.811168 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb"} err="failed to get container status \"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\": rpc error: code = NotFound desc = could not find container \"0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb\": container with ID starting with 0ab334407f5fe8d5ab265d00fe90edcc7b721a1bd776206d69dcff0c6c5c5abb not found: ID does not exist" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.811192 4738 scope.go:117] "RemoveContainer" containerID="0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51" Mar 07 07:12:15 crc kubenswrapper[4738]: I0307 07:12:15.811570 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51"} err="failed to get container status \"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51\": rpc error: code = NotFound desc = could not find container \"0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51\": container with ID starting with 0fb4cd5c8c3b62c3e0b7eddd4a01feece5cf1c9436e5725f2856cf3d1f881b51 not found: ID does not exist" Mar 07 07:12:16 crc kubenswrapper[4738]: I0307 07:12:16.392626 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e3f9734-9fb5-4b90-9268-888bc377406e" path="/var/lib/kubelet/pods/0e3f9734-9fb5-4b90-9268-888bc377406e/volumes" Mar 07 07:12:16 crc kubenswrapper[4738]: I0307 07:12:16.510997 4738 generic.go:334] "Generic (PLEG): container finished" podID="e3cda3db-6079-4bb8-850b-ff3b85feb9cc" containerID="9b2d3dc376551efe38bb606bc8baa977d940677c6f9273fd0460268724ef39cf" exitCode=0 Mar 07 07:12:16 crc kubenswrapper[4738]: I0307 
07:12:16.511073 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" event={"ID":"e3cda3db-6079-4bb8-850b-ff3b85feb9cc","Type":"ContainerDied","Data":"9b2d3dc376551efe38bb606bc8baa977d940677c6f9273fd0460268724ef39cf"} Mar 07 07:12:16 crc kubenswrapper[4738]: I0307 07:12:16.511104 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" event={"ID":"e3cda3db-6079-4bb8-850b-ff3b85feb9cc","Type":"ContainerStarted","Data":"170751493a6e91a70345e14f11d4c34b2def703eca91039db70536c4c2875d08"} Mar 07 07:12:17 crc kubenswrapper[4738]: I0307 07:12:17.525643 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" event={"ID":"e3cda3db-6079-4bb8-850b-ff3b85feb9cc","Type":"ContainerStarted","Data":"e7cc2f3cad7cadca385b0b9f64597741af8af3ba1a2b75299455972273a3c6e2"} Mar 07 07:12:17 crc kubenswrapper[4738]: I0307 07:12:17.526483 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" event={"ID":"e3cda3db-6079-4bb8-850b-ff3b85feb9cc","Type":"ContainerStarted","Data":"a8f2cda58e587449006d70888d27368241c6a72c214608966796a8dec68f60f2"} Mar 07 07:12:17 crc kubenswrapper[4738]: I0307 07:12:17.526504 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" event={"ID":"e3cda3db-6079-4bb8-850b-ff3b85feb9cc","Type":"ContainerStarted","Data":"f89bf67a9825bccc54fd9d36c9737ff5004daafcbd1838b1b7c8e0baa593b3b5"} Mar 07 07:12:17 crc kubenswrapper[4738]: I0307 07:12:17.526523 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" event={"ID":"e3cda3db-6079-4bb8-850b-ff3b85feb9cc","Type":"ContainerStarted","Data":"c5c2afcd315d4dda7c5d636cfbee306cc8e31e2d0f9ae397d65dab7fed2330bc"} Mar 07 07:12:17 crc kubenswrapper[4738]: I0307 07:12:17.526541 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" event={"ID":"e3cda3db-6079-4bb8-850b-ff3b85feb9cc","Type":"ContainerStarted","Data":"1efddd53f6d90a14e3e01dbf53f735cab4609725d2abada6cf6c64c6e14192e7"} Mar 07 07:12:17 crc kubenswrapper[4738]: I0307 07:12:17.526559 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" event={"ID":"e3cda3db-6079-4bb8-850b-ff3b85feb9cc","Type":"ContainerStarted","Data":"374377abf9055f0db5b8a62b26731b3b7eb772c441de58f6fb208fdcdcb18f65"} Mar 07 07:12:20 crc kubenswrapper[4738]: I0307 07:12:20.554697 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" event={"ID":"e3cda3db-6079-4bb8-850b-ff3b85feb9cc","Type":"ContainerStarted","Data":"0317f0a288d73aa513471b53066856563d7893b8823ecb1b9d4d27324816a7b0"} Mar 07 07:12:22 crc kubenswrapper[4738]: I0307 07:12:22.570814 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" event={"ID":"e3cda3db-6079-4bb8-850b-ff3b85feb9cc","Type":"ContainerStarted","Data":"cfd29a16c6ba06427906f00914b2f7747c3c3990c6e263b68e9664c37d1da17b"} Mar 07 07:12:22 crc kubenswrapper[4738]: I0307 07:12:22.571201 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:22 crc kubenswrapper[4738]: I0307 07:12:22.606136 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" podStartSLOduration=7.606117552 podStartE2EDuration="7.606117552s" podCreationTimestamp="2026-03-07 07:12:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:12:22.600820468 +0000 UTC m=+761.065807809" watchObservedRunningTime="2026-03-07 07:12:22.606117552 +0000 UTC m=+761.071104893" Mar 07 07:12:22 crc kubenswrapper[4738]: I0307 07:12:22.656408 4738 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:23 crc kubenswrapper[4738]: I0307 07:12:23.578606 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:23 crc kubenswrapper[4738]: I0307 07:12:23.579094 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:23 crc kubenswrapper[4738]: I0307 07:12:23.620455 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:28 crc kubenswrapper[4738]: I0307 07:12:28.387444 4738 scope.go:117] "RemoveContainer" containerID="3ee44a78672d8a355b0dbe707d77bfb8475b2f769e671f6359e5720aaae3c345" Mar 07 07:12:28 crc kubenswrapper[4738]: E0307 07:12:28.388125 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-54cnw_openshift-multus(c0a91659-d53f-4694-82a7-8c66445ab4f5)\"" pod="openshift-multus/multus-54cnw" podUID="c0a91659-d53f-4694-82a7-8c66445ab4f5" Mar 07 07:12:39 crc kubenswrapper[4738]: I0307 07:12:39.386141 4738 scope.go:117] "RemoveContainer" containerID="3ee44a78672d8a355b0dbe707d77bfb8475b2f769e671f6359e5720aaae3c345" Mar 07 07:12:39 crc kubenswrapper[4738]: I0307 07:12:39.691638 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-54cnw_c0a91659-d53f-4694-82a7-8c66445ab4f5/kube-multus/2.log" Mar 07 07:12:39 crc kubenswrapper[4738]: I0307 07:12:39.692635 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-54cnw_c0a91659-d53f-4694-82a7-8c66445ab4f5/kube-multus/1.log" Mar 07 07:12:39 crc kubenswrapper[4738]: I0307 07:12:39.692697 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-54cnw" event={"ID":"c0a91659-d53f-4694-82a7-8c66445ab4f5","Type":"ContainerStarted","Data":"af7c8d96e10421bd4da70a9ee0e0e2ffbad8b6c551cfad120df05ecfd5d35e2d"} Mar 07 07:12:41 crc kubenswrapper[4738]: I0307 07:12:41.537412 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg"] Mar 07 07:12:41 crc kubenswrapper[4738]: I0307 07:12:41.539020 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" Mar 07 07:12:41 crc kubenswrapper[4738]: I0307 07:12:41.542692 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 07 07:12:41 crc kubenswrapper[4738]: I0307 07:12:41.550796 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg"] Mar 07 07:12:41 crc kubenswrapper[4738]: I0307 07:12:41.658549 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3d9bf10-cf58-4926-9c16-fe8e0f322287-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg\" (UID: \"b3d9bf10-cf58-4926-9c16-fe8e0f322287\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" Mar 07 07:12:41 crc kubenswrapper[4738]: I0307 07:12:41.658920 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3d9bf10-cf58-4926-9c16-fe8e0f322287-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg\" (UID: \"b3d9bf10-cf58-4926-9c16-fe8e0f322287\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" Mar 07 07:12:41 crc 
kubenswrapper[4738]: I0307 07:12:41.659223 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5gnf\" (UniqueName: \"kubernetes.io/projected/b3d9bf10-cf58-4926-9c16-fe8e0f322287-kube-api-access-c5gnf\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg\" (UID: \"b3d9bf10-cf58-4926-9c16-fe8e0f322287\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" Mar 07 07:12:41 crc kubenswrapper[4738]: I0307 07:12:41.761114 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3d9bf10-cf58-4926-9c16-fe8e0f322287-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg\" (UID: \"b3d9bf10-cf58-4926-9c16-fe8e0f322287\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" Mar 07 07:12:41 crc kubenswrapper[4738]: I0307 07:12:41.761220 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5gnf\" (UniqueName: \"kubernetes.io/projected/b3d9bf10-cf58-4926-9c16-fe8e0f322287-kube-api-access-c5gnf\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg\" (UID: \"b3d9bf10-cf58-4926-9c16-fe8e0f322287\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" Mar 07 07:12:41 crc kubenswrapper[4738]: I0307 07:12:41.761267 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3d9bf10-cf58-4926-9c16-fe8e0f322287-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg\" (UID: \"b3d9bf10-cf58-4926-9c16-fe8e0f322287\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" Mar 07 07:12:41 crc kubenswrapper[4738]: I0307 07:12:41.761826 4738 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3d9bf10-cf58-4926-9c16-fe8e0f322287-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg\" (UID: \"b3d9bf10-cf58-4926-9c16-fe8e0f322287\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" Mar 07 07:12:41 crc kubenswrapper[4738]: I0307 07:12:41.761904 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3d9bf10-cf58-4926-9c16-fe8e0f322287-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg\" (UID: \"b3d9bf10-cf58-4926-9c16-fe8e0f322287\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" Mar 07 07:12:41 crc kubenswrapper[4738]: I0307 07:12:41.794392 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5gnf\" (UniqueName: \"kubernetes.io/projected/b3d9bf10-cf58-4926-9c16-fe8e0f322287-kube-api-access-c5gnf\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg\" (UID: \"b3d9bf10-cf58-4926-9c16-fe8e0f322287\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" Mar 07 07:12:41 crc kubenswrapper[4738]: I0307 07:12:41.862985 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" Mar 07 07:12:42 crc kubenswrapper[4738]: I0307 07:12:42.078103 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg"] Mar 07 07:12:42 crc kubenswrapper[4738]: W0307 07:12:42.083218 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3d9bf10_cf58_4926_9c16_fe8e0f322287.slice/crio-33ddad896809374409d7a8ad6cdcf459017e20e66c329782e931427dddaf891f WatchSource:0}: Error finding container 33ddad896809374409d7a8ad6cdcf459017e20e66c329782e931427dddaf891f: Status 404 returned error can't find the container with id 33ddad896809374409d7a8ad6cdcf459017e20e66c329782e931427dddaf891f Mar 07 07:12:42 crc kubenswrapper[4738]: I0307 07:12:42.712222 4738 generic.go:334] "Generic (PLEG): container finished" podID="b3d9bf10-cf58-4926-9c16-fe8e0f322287" containerID="4964a332b4add2a05974956ff7249f52b649133783a1e35bca8b775a6576e1b5" exitCode=0 Mar 07 07:12:42 crc kubenswrapper[4738]: I0307 07:12:42.712294 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" event={"ID":"b3d9bf10-cf58-4926-9c16-fe8e0f322287","Type":"ContainerDied","Data":"4964a332b4add2a05974956ff7249f52b649133783a1e35bca8b775a6576e1b5"} Mar 07 07:12:42 crc kubenswrapper[4738]: I0307 07:12:42.712338 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" event={"ID":"b3d9bf10-cf58-4926-9c16-fe8e0f322287","Type":"ContainerStarted","Data":"33ddad896809374409d7a8ad6cdcf459017e20e66c329782e931427dddaf891f"} Mar 07 07:12:43 crc kubenswrapper[4738]: I0307 07:12:43.658349 4738 scope.go:117] "RemoveContainer" 
containerID="b22ad71b7f687dd90c5dd50584cc04290b1dfe21234cb408d282b7612dfb18f0" Mar 07 07:12:43 crc kubenswrapper[4738]: I0307 07:12:43.708187 4738 scope.go:117] "RemoveContainer" containerID="13d0b1b16f53e9e23fe6bbed5082f4f749c4f91f743d1b8400e0a81e8b1944c5" Mar 07 07:12:44 crc kubenswrapper[4738]: I0307 07:12:44.733810 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-54cnw_c0a91659-d53f-4694-82a7-8c66445ab4f5/kube-multus/2.log" Mar 07 07:12:44 crc kubenswrapper[4738]: I0307 07:12:44.737348 4738 generic.go:334] "Generic (PLEG): container finished" podID="b3d9bf10-cf58-4926-9c16-fe8e0f322287" containerID="ec2e139491abf28afc1af9948d57d5be6c04a6694da22f20606d19c830da59ac" exitCode=0 Mar 07 07:12:44 crc kubenswrapper[4738]: I0307 07:12:44.737422 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" event={"ID":"b3d9bf10-cf58-4926-9c16-fe8e0f322287","Type":"ContainerDied","Data":"ec2e139491abf28afc1af9948d57d5be6c04a6694da22f20606d19c830da59ac"} Mar 07 07:12:45 crc kubenswrapper[4738]: I0307 07:12:45.524918 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-zdrck" Mar 07 07:12:45 crc kubenswrapper[4738]: I0307 07:12:45.759204 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" event={"ID":"b3d9bf10-cf58-4926-9c16-fe8e0f322287","Type":"ContainerStarted","Data":"8578b2561a8af9deee1d438f8da9e7479a1c01e5b713037c75042c26e7edac0d"} Mar 07 07:12:45 crc kubenswrapper[4738]: I0307 07:12:45.781990 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" podStartSLOduration=3.455245922 podStartE2EDuration="4.781970823s" podCreationTimestamp="2026-03-07 07:12:41 +0000 UTC" 
firstStartedPulling="2026-03-07 07:12:42.714863955 +0000 UTC m=+781.179851296" lastFinishedPulling="2026-03-07 07:12:44.041588836 +0000 UTC m=+782.506576197" observedRunningTime="2026-03-07 07:12:45.778533031 +0000 UTC m=+784.243520402" watchObservedRunningTime="2026-03-07 07:12:45.781970823 +0000 UTC m=+784.246958154" Mar 07 07:12:46 crc kubenswrapper[4738]: I0307 07:12:46.769474 4738 generic.go:334] "Generic (PLEG): container finished" podID="b3d9bf10-cf58-4926-9c16-fe8e0f322287" containerID="8578b2561a8af9deee1d438f8da9e7479a1c01e5b713037c75042c26e7edac0d" exitCode=0 Mar 07 07:12:46 crc kubenswrapper[4738]: I0307 07:12:46.769549 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" event={"ID":"b3d9bf10-cf58-4926-9c16-fe8e0f322287","Type":"ContainerDied","Data":"8578b2561a8af9deee1d438f8da9e7479a1c01e5b713037c75042c26e7edac0d"} Mar 07 07:12:48 crc kubenswrapper[4738]: I0307 07:12:48.143544 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" Mar 07 07:12:48 crc kubenswrapper[4738]: I0307 07:12:48.258964 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5gnf\" (UniqueName: \"kubernetes.io/projected/b3d9bf10-cf58-4926-9c16-fe8e0f322287-kube-api-access-c5gnf\") pod \"b3d9bf10-cf58-4926-9c16-fe8e0f322287\" (UID: \"b3d9bf10-cf58-4926-9c16-fe8e0f322287\") " Mar 07 07:12:48 crc kubenswrapper[4738]: I0307 07:12:48.259081 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3d9bf10-cf58-4926-9c16-fe8e0f322287-bundle\") pod \"b3d9bf10-cf58-4926-9c16-fe8e0f322287\" (UID: \"b3d9bf10-cf58-4926-9c16-fe8e0f322287\") " Mar 07 07:12:48 crc kubenswrapper[4738]: I0307 07:12:48.259117 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3d9bf10-cf58-4926-9c16-fe8e0f322287-util\") pod \"b3d9bf10-cf58-4926-9c16-fe8e0f322287\" (UID: \"b3d9bf10-cf58-4926-9c16-fe8e0f322287\") " Mar 07 07:12:48 crc kubenswrapper[4738]: I0307 07:12:48.261495 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d9bf10-cf58-4926-9c16-fe8e0f322287-bundle" (OuterVolumeSpecName: "bundle") pod "b3d9bf10-cf58-4926-9c16-fe8e0f322287" (UID: "b3d9bf10-cf58-4926-9c16-fe8e0f322287"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:12:48 crc kubenswrapper[4738]: I0307 07:12:48.269069 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3d9bf10-cf58-4926-9c16-fe8e0f322287-kube-api-access-c5gnf" (OuterVolumeSpecName: "kube-api-access-c5gnf") pod "b3d9bf10-cf58-4926-9c16-fe8e0f322287" (UID: "b3d9bf10-cf58-4926-9c16-fe8e0f322287"). InnerVolumeSpecName "kube-api-access-c5gnf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:12:48 crc kubenswrapper[4738]: I0307 07:12:48.270986 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3d9bf10-cf58-4926-9c16-fe8e0f322287-util" (OuterVolumeSpecName: "util") pod "b3d9bf10-cf58-4926-9c16-fe8e0f322287" (UID: "b3d9bf10-cf58-4926-9c16-fe8e0f322287"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:12:48 crc kubenswrapper[4738]: I0307 07:12:48.360757 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5gnf\" (UniqueName: \"kubernetes.io/projected/b3d9bf10-cf58-4926-9c16-fe8e0f322287-kube-api-access-c5gnf\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:48 crc kubenswrapper[4738]: I0307 07:12:48.360872 4738 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b3d9bf10-cf58-4926-9c16-fe8e0f322287-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:48 crc kubenswrapper[4738]: I0307 07:12:48.360892 4738 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b3d9bf10-cf58-4926-9c16-fe8e0f322287-util\") on node \"crc\" DevicePath \"\"" Mar 07 07:12:48 crc kubenswrapper[4738]: I0307 07:12:48.791353 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" event={"ID":"b3d9bf10-cf58-4926-9c16-fe8e0f322287","Type":"ContainerDied","Data":"33ddad896809374409d7a8ad6cdcf459017e20e66c329782e931427dddaf891f"} Mar 07 07:12:48 crc kubenswrapper[4738]: I0307 07:12:48.791426 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33ddad896809374409d7a8ad6cdcf459017e20e66c329782e931427dddaf891f" Mar 07 07:12:48 crc kubenswrapper[4738]: I0307 07:12:48.791486 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.479718 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97"] Mar 07 07:12:56 crc kubenswrapper[4738]: E0307 07:12:56.480510 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d9bf10-cf58-4926-9c16-fe8e0f322287" containerName="extract" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.480523 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d9bf10-cf58-4926-9c16-fe8e0f322287" containerName="extract" Mar 07 07:12:56 crc kubenswrapper[4738]: E0307 07:12:56.480533 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d9bf10-cf58-4926-9c16-fe8e0f322287" containerName="util" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.480538 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d9bf10-cf58-4926-9c16-fe8e0f322287" containerName="util" Mar 07 07:12:56 crc kubenswrapper[4738]: E0307 07:12:56.480550 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3d9bf10-cf58-4926-9c16-fe8e0f322287" containerName="pull" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.480556 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3d9bf10-cf58-4926-9c16-fe8e0f322287" containerName="pull" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.480650 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3d9bf10-cf58-4926-9c16-fe8e0f322287" containerName="extract" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.481041 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.485408 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.485765 4738 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.485805 4738 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.489301 4738 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-sr5zv" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.489584 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.511011 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97"] Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.580883 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkzgg\" (UniqueName: \"kubernetes.io/projected/756069cb-b586-444a-9f5a-48b3316677c2-kube-api-access-hkzgg\") pod \"metallb-operator-controller-manager-5b66fc78fd-rzn97\" (UID: \"756069cb-b586-444a-9f5a-48b3316677c2\") " pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.580936 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/756069cb-b586-444a-9f5a-48b3316677c2-webhook-cert\") pod 
\"metallb-operator-controller-manager-5b66fc78fd-rzn97\" (UID: \"756069cb-b586-444a-9f5a-48b3316677c2\") " pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.581027 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/756069cb-b586-444a-9f5a-48b3316677c2-apiservice-cert\") pod \"metallb-operator-controller-manager-5b66fc78fd-rzn97\" (UID: \"756069cb-b586-444a-9f5a-48b3316677c2\") " pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.687415 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/756069cb-b586-444a-9f5a-48b3316677c2-apiservice-cert\") pod \"metallb-operator-controller-manager-5b66fc78fd-rzn97\" (UID: \"756069cb-b586-444a-9f5a-48b3316677c2\") " pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.687564 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkzgg\" (UniqueName: \"kubernetes.io/projected/756069cb-b586-444a-9f5a-48b3316677c2-kube-api-access-hkzgg\") pod \"metallb-operator-controller-manager-5b66fc78fd-rzn97\" (UID: \"756069cb-b586-444a-9f5a-48b3316677c2\") " pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.687610 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/756069cb-b586-444a-9f5a-48b3316677c2-webhook-cert\") pod \"metallb-operator-controller-manager-5b66fc78fd-rzn97\" (UID: \"756069cb-b586-444a-9f5a-48b3316677c2\") " pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" Mar 07 07:12:56 crc 
kubenswrapper[4738]: I0307 07:12:56.694457 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/756069cb-b586-444a-9f5a-48b3316677c2-webhook-cert\") pod \"metallb-operator-controller-manager-5b66fc78fd-rzn97\" (UID: \"756069cb-b586-444a-9f5a-48b3316677c2\") " pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.698817 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/756069cb-b586-444a-9f5a-48b3316677c2-apiservice-cert\") pod \"metallb-operator-controller-manager-5b66fc78fd-rzn97\" (UID: \"756069cb-b586-444a-9f5a-48b3316677c2\") " pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.711596 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkzgg\" (UniqueName: \"kubernetes.io/projected/756069cb-b586-444a-9f5a-48b3316677c2-kube-api-access-hkzgg\") pod \"metallb-operator-controller-manager-5b66fc78fd-rzn97\" (UID: \"756069cb-b586-444a-9f5a-48b3316677c2\") " pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.798743 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.800616 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq"] Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.801593 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.804755 4738 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-92p8t" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.805189 4738 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.805425 4738 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.883074 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq"] Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.890141 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jj5db\" (UniqueName: \"kubernetes.io/projected/e109b736-b857-43fd-821e-7f6f71490c99-kube-api-access-jj5db\") pod \"metallb-operator-webhook-server-78584b5797-gwhrq\" (UID: \"e109b736-b857-43fd-821e-7f6f71490c99\") " pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.890244 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e109b736-b857-43fd-821e-7f6f71490c99-apiservice-cert\") pod \"metallb-operator-webhook-server-78584b5797-gwhrq\" (UID: \"e109b736-b857-43fd-821e-7f6f71490c99\") " pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.890278 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/e109b736-b857-43fd-821e-7f6f71490c99-webhook-cert\") pod \"metallb-operator-webhook-server-78584b5797-gwhrq\" (UID: \"e109b736-b857-43fd-821e-7f6f71490c99\") " pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.995862 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e109b736-b857-43fd-821e-7f6f71490c99-apiservice-cert\") pod \"metallb-operator-webhook-server-78584b5797-gwhrq\" (UID: \"e109b736-b857-43fd-821e-7f6f71490c99\") " pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.996283 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e109b736-b857-43fd-821e-7f6f71490c99-webhook-cert\") pod \"metallb-operator-webhook-server-78584b5797-gwhrq\" (UID: \"e109b736-b857-43fd-821e-7f6f71490c99\") " pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" Mar 07 07:12:56 crc kubenswrapper[4738]: I0307 07:12:56.996348 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jj5db\" (UniqueName: \"kubernetes.io/projected/e109b736-b857-43fd-821e-7f6f71490c99-kube-api-access-jj5db\") pod \"metallb-operator-webhook-server-78584b5797-gwhrq\" (UID: \"e109b736-b857-43fd-821e-7f6f71490c99\") " pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" Mar 07 07:12:57 crc kubenswrapper[4738]: I0307 07:12:57.001673 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e109b736-b857-43fd-821e-7f6f71490c99-apiservice-cert\") pod \"metallb-operator-webhook-server-78584b5797-gwhrq\" (UID: \"e109b736-b857-43fd-821e-7f6f71490c99\") " pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" Mar 07 
07:12:57 crc kubenswrapper[4738]: I0307 07:12:57.002285 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e109b736-b857-43fd-821e-7f6f71490c99-webhook-cert\") pod \"metallb-operator-webhook-server-78584b5797-gwhrq\" (UID: \"e109b736-b857-43fd-821e-7f6f71490c99\") " pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" Mar 07 07:12:57 crc kubenswrapper[4738]: I0307 07:12:57.013756 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jj5db\" (UniqueName: \"kubernetes.io/projected/e109b736-b857-43fd-821e-7f6f71490c99-kube-api-access-jj5db\") pod \"metallb-operator-webhook-server-78584b5797-gwhrq\" (UID: \"e109b736-b857-43fd-821e-7f6f71490c99\") " pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" Mar 07 07:12:57 crc kubenswrapper[4738]: I0307 07:12:57.072547 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97"] Mar 07 07:12:57 crc kubenswrapper[4738]: I0307 07:12:57.166609 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" Mar 07 07:12:57 crc kubenswrapper[4738]: I0307 07:12:57.379647 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq"] Mar 07 07:12:57 crc kubenswrapper[4738]: I0307 07:12:57.859114 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" event={"ID":"e109b736-b857-43fd-821e-7f6f71490c99","Type":"ContainerStarted","Data":"8d75c4a67c00b779a16caedd174b22f61558a7c75414f193c15e0b9bb5da8562"} Mar 07 07:12:57 crc kubenswrapper[4738]: I0307 07:12:57.860918 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" event={"ID":"756069cb-b586-444a-9f5a-48b3316677c2","Type":"ContainerStarted","Data":"ac6c9836788dd2139242f79d63395f2befc0f54831c689639fabe1a6adc88f4d"} Mar 07 07:13:02 crc kubenswrapper[4738]: I0307 07:13:02.933583 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" event={"ID":"756069cb-b586-444a-9f5a-48b3316677c2","Type":"ContainerStarted","Data":"5dedc5eb10dd15aa1466e2c07670b4c5948cf23d27ae9e36130011a9be2bac27"} Mar 07 07:13:02 crc kubenswrapper[4738]: I0307 07:13:02.934262 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" Mar 07 07:13:02 crc kubenswrapper[4738]: I0307 07:13:02.935297 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" event={"ID":"e109b736-b857-43fd-821e-7f6f71490c99","Type":"ContainerStarted","Data":"495956f4a0099a6dc90838ea9f4193731427d06e91bd2a5ac9cbf7a2ca5879c0"} Mar 07 07:13:02 crc kubenswrapper[4738]: I0307 07:13:02.935561 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" Mar 07 07:13:02 crc kubenswrapper[4738]: I0307 07:13:02.983398 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" podStartSLOduration=1.919451384 podStartE2EDuration="6.983376859s" podCreationTimestamp="2026-03-07 07:12:56 +0000 UTC" firstStartedPulling="2026-03-07 07:12:57.08814166 +0000 UTC m=+795.553128981" lastFinishedPulling="2026-03-07 07:13:02.152067115 +0000 UTC m=+800.617054456" observedRunningTime="2026-03-07 07:13:02.980896323 +0000 UTC m=+801.445883644" watchObservedRunningTime="2026-03-07 07:13:02.983376859 +0000 UTC m=+801.448364190" Mar 07 07:13:03 crc kubenswrapper[4738]: I0307 07:13:03.013561 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" podStartSLOduration=2.189459419 podStartE2EDuration="7.013541816s" podCreationTimestamp="2026-03-07 07:12:56 +0000 UTC" firstStartedPulling="2026-03-07 07:12:57.390057617 +0000 UTC m=+795.855044938" lastFinishedPulling="2026-03-07 07:13:02.214140014 +0000 UTC m=+800.679127335" observedRunningTime="2026-03-07 07:13:03.013120574 +0000 UTC m=+801.478107895" watchObservedRunningTime="2026-03-07 07:13:03.013541816 +0000 UTC m=+801.478529137" Mar 07 07:13:17 crc kubenswrapper[4738]: I0307 07:13:17.171610 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-78584b5797-gwhrq" Mar 07 07:13:36 crc kubenswrapper[4738]: I0307 07:13:36.803144 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5b66fc78fd-rzn97" Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.547476 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-n9czq"] Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.550521 
4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.552636 4738 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-nmltp"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.552726 4738 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.552747 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.561013 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg"]
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.561955 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.563882 4738 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.575184 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg"]
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.583104 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5l7b\" (UniqueName: \"kubernetes.io/projected/69c55926-37e2-41f4-aaac-8d6aeacbfb47-kube-api-access-c5l7b\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.583192 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/69c55926-37e2-41f4-aaac-8d6aeacbfb47-frr-conf\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.583248 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0623b7d4-baca-4ea1-a095-551939ddc05f-cert\") pod \"frr-k8s-webhook-server-7f989f654f-2dvtg\" (UID: \"0623b7d4-baca-4ea1-a095-551939ddc05f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.583300 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/69c55926-37e2-41f4-aaac-8d6aeacbfb47-frr-sockets\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.583357 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/69c55926-37e2-41f4-aaac-8d6aeacbfb47-metrics\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.583420 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wkclm\" (UniqueName: \"kubernetes.io/projected/0623b7d4-baca-4ea1-a095-551939ddc05f-kube-api-access-wkclm\") pod \"frr-k8s-webhook-server-7f989f654f-2dvtg\" (UID: \"0623b7d4-baca-4ea1-a095-551939ddc05f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.583493 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/69c55926-37e2-41f4-aaac-8d6aeacbfb47-reloader\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.583517 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69c55926-37e2-41f4-aaac-8d6aeacbfb47-metrics-certs\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.583551 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/69c55926-37e2-41f4-aaac-8d6aeacbfb47-frr-startup\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.642945 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-vpfpk"]
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.644074 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.647130 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-rtspq"]
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.647213 4738 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.647440 4738 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-ns6sk"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.648706 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.652145 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-rtspq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.653988 4738 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.654827 4738 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.693399 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wkclm\" (UniqueName: \"kubernetes.io/projected/0623b7d4-baca-4ea1-a095-551939ddc05f-kube-api-access-wkclm\") pod \"frr-k8s-webhook-server-7f989f654f-2dvtg\" (UID: \"0623b7d4-baca-4ea1-a095-551939ddc05f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.693453 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/69c55926-37e2-41f4-aaac-8d6aeacbfb47-reloader\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.693472 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69c55926-37e2-41f4-aaac-8d6aeacbfb47-metrics-certs\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.693494 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/69c55926-37e2-41f4-aaac-8d6aeacbfb47-frr-startup\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.693527 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5l7b\" (UniqueName: \"kubernetes.io/projected/69c55926-37e2-41f4-aaac-8d6aeacbfb47-kube-api-access-c5l7b\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.693548 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/69c55926-37e2-41f4-aaac-8d6aeacbfb47-frr-conf\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.693567 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0623b7d4-baca-4ea1-a095-551939ddc05f-cert\") pod \"frr-k8s-webhook-server-7f989f654f-2dvtg\" (UID: \"0623b7d4-baca-4ea1-a095-551939ddc05f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.693589 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/69c55926-37e2-41f4-aaac-8d6aeacbfb47-frr-sockets\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.693607 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/69c55926-37e2-41f4-aaac-8d6aeacbfb47-metrics\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.694473 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/69c55926-37e2-41f4-aaac-8d6aeacbfb47-frr-startup\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: E0307 07:13:37.694813 4738 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found
Mar 07 07:13:37 crc kubenswrapper[4738]: E0307 07:13:37.694866 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0623b7d4-baca-4ea1-a095-551939ddc05f-cert podName:0623b7d4-baca-4ea1-a095-551939ddc05f nodeName:}" failed. No retries permitted until 2026-03-07 07:13:38.194850972 +0000 UTC m=+836.659838283 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0623b7d4-baca-4ea1-a095-551939ddc05f-cert") pod "frr-k8s-webhook-server-7f989f654f-2dvtg" (UID: "0623b7d4-baca-4ea1-a095-551939ddc05f") : secret "frr-k8s-webhook-server-cert" not found
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.694933 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/69c55926-37e2-41f4-aaac-8d6aeacbfb47-reloader\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.695108 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/69c55926-37e2-41f4-aaac-8d6aeacbfb47-frr-sockets\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.695366 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/69c55926-37e2-41f4-aaac-8d6aeacbfb47-metrics\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.695423 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/69c55926-37e2-41f4-aaac-8d6aeacbfb47-frr-conf\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.703086 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/69c55926-37e2-41f4-aaac-8d6aeacbfb47-metrics-certs\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.707586 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-rtspq"]
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.721429 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wkclm\" (UniqueName: \"kubernetes.io/projected/0623b7d4-baca-4ea1-a095-551939ddc05f-kube-api-access-wkclm\") pod \"frr-k8s-webhook-server-7f989f654f-2dvtg\" (UID: \"0623b7d4-baca-4ea1-a095-551939ddc05f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.721821 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5l7b\" (UniqueName: \"kubernetes.io/projected/69c55926-37e2-41f4-aaac-8d6aeacbfb47-kube-api-access-c5l7b\") pod \"frr-k8s-n9czq\" (UID: \"69c55926-37e2-41f4-aaac-8d6aeacbfb47\") " pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.794478 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59fb62c4-aece-4eae-99d3-92872dc1c4de-cert\") pod \"controller-86ddb6bd46-rtspq\" (UID: \"59fb62c4-aece-4eae-99d3-92872dc1c4de\") " pod="metallb-system/controller-86ddb6bd46-rtspq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.794554 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t4fk\" (UniqueName: \"kubernetes.io/projected/161f12dc-536f-41ca-89b8-4feca285a795-kube-api-access-4t4fk\") pod \"speaker-vpfpk\" (UID: \"161f12dc-536f-41ca-89b8-4feca285a795\") " pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.794620 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/161f12dc-536f-41ca-89b8-4feca285a795-metallb-excludel2\") pod \"speaker-vpfpk\" (UID: \"161f12dc-536f-41ca-89b8-4feca285a795\") " pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.794678 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dcwl\" (UniqueName: \"kubernetes.io/projected/59fb62c4-aece-4eae-99d3-92872dc1c4de-kube-api-access-9dcwl\") pod \"controller-86ddb6bd46-rtspq\" (UID: \"59fb62c4-aece-4eae-99d3-92872dc1c4de\") " pod="metallb-system/controller-86ddb6bd46-rtspq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.794704 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/161f12dc-536f-41ca-89b8-4feca285a795-metrics-certs\") pod \"speaker-vpfpk\" (UID: \"161f12dc-536f-41ca-89b8-4feca285a795\") " pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.794740 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/161f12dc-536f-41ca-89b8-4feca285a795-memberlist\") pod \"speaker-vpfpk\" (UID: \"161f12dc-536f-41ca-89b8-4feca285a795\") " pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.794756 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59fb62c4-aece-4eae-99d3-92872dc1c4de-metrics-certs\") pod \"controller-86ddb6bd46-rtspq\" (UID: \"59fb62c4-aece-4eae-99d3-92872dc1c4de\") " pod="metallb-system/controller-86ddb6bd46-rtspq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.880284 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.897667 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/161f12dc-536f-41ca-89b8-4feca285a795-memberlist\") pod \"speaker-vpfpk\" (UID: \"161f12dc-536f-41ca-89b8-4feca285a795\") " pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.897714 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59fb62c4-aece-4eae-99d3-92872dc1c4de-metrics-certs\") pod \"controller-86ddb6bd46-rtspq\" (UID: \"59fb62c4-aece-4eae-99d3-92872dc1c4de\") " pod="metallb-system/controller-86ddb6bd46-rtspq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.897754 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59fb62c4-aece-4eae-99d3-92872dc1c4de-cert\") pod \"controller-86ddb6bd46-rtspq\" (UID: \"59fb62c4-aece-4eae-99d3-92872dc1c4de\") " pod="metallb-system/controller-86ddb6bd46-rtspq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.897785 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4t4fk\" (UniqueName: \"kubernetes.io/projected/161f12dc-536f-41ca-89b8-4feca285a795-kube-api-access-4t4fk\") pod \"speaker-vpfpk\" (UID: \"161f12dc-536f-41ca-89b8-4feca285a795\") " pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.897806 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/161f12dc-536f-41ca-89b8-4feca285a795-metallb-excludel2\") pod \"speaker-vpfpk\" (UID: \"161f12dc-536f-41ca-89b8-4feca285a795\") " pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.897833 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9dcwl\" (UniqueName: \"kubernetes.io/projected/59fb62c4-aece-4eae-99d3-92872dc1c4de-kube-api-access-9dcwl\") pod \"controller-86ddb6bd46-rtspq\" (UID: \"59fb62c4-aece-4eae-99d3-92872dc1c4de\") " pod="metallb-system/controller-86ddb6bd46-rtspq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.897853 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/161f12dc-536f-41ca-89b8-4feca285a795-metrics-certs\") pod \"speaker-vpfpk\" (UID: \"161f12dc-536f-41ca-89b8-4feca285a795\") " pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:37 crc kubenswrapper[4738]: E0307 07:13:37.897886 4738 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 07 07:13:37 crc kubenswrapper[4738]: E0307 07:13:37.897968 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/161f12dc-536f-41ca-89b8-4feca285a795-memberlist podName:161f12dc-536f-41ca-89b8-4feca285a795 nodeName:}" failed. No retries permitted until 2026-03-07 07:13:38.397942898 +0000 UTC m=+836.862930259 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/161f12dc-536f-41ca-89b8-4feca285a795-memberlist") pod "speaker-vpfpk" (UID: "161f12dc-536f-41ca-89b8-4feca285a795") : secret "metallb-memberlist" not found
Mar 07 07:13:37 crc kubenswrapper[4738]: E0307 07:13:37.898498 4738 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found
Mar 07 07:13:37 crc kubenswrapper[4738]: E0307 07:13:37.898651 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/59fb62c4-aece-4eae-99d3-92872dc1c4de-metrics-certs podName:59fb62c4-aece-4eae-99d3-92872dc1c4de nodeName:}" failed. No retries permitted until 2026-03-07 07:13:38.398617747 +0000 UTC m=+836.863605078 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/59fb62c4-aece-4eae-99d3-92872dc1c4de-metrics-certs") pod "controller-86ddb6bd46-rtspq" (UID: "59fb62c4-aece-4eae-99d3-92872dc1c4de") : secret "controller-certs-secret" not found
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.899591 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/161f12dc-536f-41ca-89b8-4feca285a795-metallb-excludel2\") pod \"speaker-vpfpk\" (UID: \"161f12dc-536f-41ca-89b8-4feca285a795\") " pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.900730 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/161f12dc-536f-41ca-89b8-4feca285a795-metrics-certs\") pod \"speaker-vpfpk\" (UID: \"161f12dc-536f-41ca-89b8-4feca285a795\") " pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.903125 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/59fb62c4-aece-4eae-99d3-92872dc1c4de-cert\") pod \"controller-86ddb6bd46-rtspq\" (UID: \"59fb62c4-aece-4eae-99d3-92872dc1c4de\") " pod="metallb-system/controller-86ddb6bd46-rtspq"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.926798 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t4fk\" (UniqueName: \"kubernetes.io/projected/161f12dc-536f-41ca-89b8-4feca285a795-kube-api-access-4t4fk\") pod \"speaker-vpfpk\" (UID: \"161f12dc-536f-41ca-89b8-4feca285a795\") " pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:37 crc kubenswrapper[4738]: I0307 07:13:37.932464 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dcwl\" (UniqueName: \"kubernetes.io/projected/59fb62c4-aece-4eae-99d3-92872dc1c4de-kube-api-access-9dcwl\") pod \"controller-86ddb6bd46-rtspq\" (UID: \"59fb62c4-aece-4eae-99d3-92872dc1c4de\") " pod="metallb-system/controller-86ddb6bd46-rtspq"
Mar 07 07:13:38 crc kubenswrapper[4738]: I0307 07:13:38.162076 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9czq" event={"ID":"69c55926-37e2-41f4-aaac-8d6aeacbfb47","Type":"ContainerStarted","Data":"e4eb7a6e7b34d6137fdc522d7d862fc34257aed6c866a3324cf4bb5680d392f6"}
Mar 07 07:13:38 crc kubenswrapper[4738]: I0307 07:13:38.202470 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0623b7d4-baca-4ea1-a095-551939ddc05f-cert\") pod \"frr-k8s-webhook-server-7f989f654f-2dvtg\" (UID: \"0623b7d4-baca-4ea1-a095-551939ddc05f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg"
Mar 07 07:13:38 crc kubenswrapper[4738]: I0307 07:13:38.210015 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0623b7d4-baca-4ea1-a095-551939ddc05f-cert\") pod \"frr-k8s-webhook-server-7f989f654f-2dvtg\" (UID: \"0623b7d4-baca-4ea1-a095-551939ddc05f\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg"
Mar 07 07:13:38 crc kubenswrapper[4738]: I0307 07:13:38.404057 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/161f12dc-536f-41ca-89b8-4feca285a795-memberlist\") pod \"speaker-vpfpk\" (UID: \"161f12dc-536f-41ca-89b8-4feca285a795\") " pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:38 crc kubenswrapper[4738]: I0307 07:13:38.404137 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59fb62c4-aece-4eae-99d3-92872dc1c4de-metrics-certs\") pod \"controller-86ddb6bd46-rtspq\" (UID: \"59fb62c4-aece-4eae-99d3-92872dc1c4de\") " pod="metallb-system/controller-86ddb6bd46-rtspq"
Mar 07 07:13:38 crc kubenswrapper[4738]: E0307 07:13:38.405247 4738 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found
Mar 07 07:13:38 crc kubenswrapper[4738]: E0307 07:13:38.405333 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/161f12dc-536f-41ca-89b8-4feca285a795-memberlist podName:161f12dc-536f-41ca-89b8-4feca285a795 nodeName:}" failed. No retries permitted until 2026-03-07 07:13:39.405307277 +0000 UTC m=+837.870294638 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/161f12dc-536f-41ca-89b8-4feca285a795-memberlist") pod "speaker-vpfpk" (UID: "161f12dc-536f-41ca-89b8-4feca285a795") : secret "metallb-memberlist" not found
Mar 07 07:13:38 crc kubenswrapper[4738]: I0307 07:13:38.409287 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/59fb62c4-aece-4eae-99d3-92872dc1c4de-metrics-certs\") pod \"controller-86ddb6bd46-rtspq\" (UID: \"59fb62c4-aece-4eae-99d3-92872dc1c4de\") " pod="metallb-system/controller-86ddb6bd46-rtspq"
Mar 07 07:13:38 crc kubenswrapper[4738]: I0307 07:13:38.487616 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg"
Mar 07 07:13:38 crc kubenswrapper[4738]: I0307 07:13:38.595877 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-rtspq"
Mar 07 07:13:38 crc kubenswrapper[4738]: I0307 07:13:38.824295 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg"]
Mar 07 07:13:38 crc kubenswrapper[4738]: I0307 07:13:38.885586 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-rtspq"]
Mar 07 07:13:39 crc kubenswrapper[4738]: I0307 07:13:39.170979 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg" event={"ID":"0623b7d4-baca-4ea1-a095-551939ddc05f","Type":"ContainerStarted","Data":"58e084429605d9f254617f44fc8e47c74c6aee22a11261ab988cdf115ed83641"}
Mar 07 07:13:39 crc kubenswrapper[4738]: I0307 07:13:39.172939 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-rtspq" event={"ID":"59fb62c4-aece-4eae-99d3-92872dc1c4de","Type":"ContainerStarted","Data":"4061935ddc546f95c23ce1787c66c27a73a60dccf142c53386c74103de17e0d9"}
Mar 07 07:13:39 crc kubenswrapper[4738]: I0307 07:13:39.173018 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-rtspq" event={"ID":"59fb62c4-aece-4eae-99d3-92872dc1c4de","Type":"ContainerStarted","Data":"742281fbd742551186ddd36b73a200c199cad94112c6bc96823cf0916b79952f"}
Mar 07 07:13:39 crc kubenswrapper[4738]: I0307 07:13:39.419260 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/161f12dc-536f-41ca-89b8-4feca285a795-memberlist\") pod \"speaker-vpfpk\" (UID: \"161f12dc-536f-41ca-89b8-4feca285a795\") " pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:39 crc kubenswrapper[4738]: I0307 07:13:39.425689 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/161f12dc-536f-41ca-89b8-4feca285a795-memberlist\") pod \"speaker-vpfpk\" (UID: \"161f12dc-536f-41ca-89b8-4feca285a795\") " pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:39 crc kubenswrapper[4738]: I0307 07:13:39.479039 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:40 crc kubenswrapper[4738]: I0307 07:13:40.188671 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vpfpk" event={"ID":"161f12dc-536f-41ca-89b8-4feca285a795","Type":"ContainerStarted","Data":"4ee871d459a79dce7f254d1cb20a5cf1ee8635b5a2005d03ab403d9ecc2d3fa7"}
Mar 07 07:13:40 crc kubenswrapper[4738]: I0307 07:13:40.188956 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vpfpk" event={"ID":"161f12dc-536f-41ca-89b8-4feca285a795","Type":"ContainerStarted","Data":"065b144f78e77a1fbfe0891ee0eee6230c2359f8dd171fb699f64d3013182b8e"}
Mar 07 07:13:43 crc kubenswrapper[4738]: I0307 07:13:43.211479 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vpfpk" event={"ID":"161f12dc-536f-41ca-89b8-4feca285a795","Type":"ContainerStarted","Data":"27d5a979c4ca39819298904858631e3c3a6d14dcd70000f07c1c602e96a48765"}
Mar 07 07:13:43 crc kubenswrapper[4738]: I0307 07:13:43.212203 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:43 crc kubenswrapper[4738]: I0307 07:13:43.213421 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-rtspq" event={"ID":"59fb62c4-aece-4eae-99d3-92872dc1c4de","Type":"ContainerStarted","Data":"9a4e65170d911b098a87c0d79429eddbb1dd5301b6c094f0b664add86d67f136"}
Mar 07 07:13:43 crc kubenswrapper[4738]: I0307 07:13:43.213566 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-rtspq"
Mar 07 07:13:43 crc kubenswrapper[4738]: I0307 07:13:43.231298 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-vpfpk" podStartSLOduration=3.402277428 podStartE2EDuration="6.231277643s" podCreationTimestamp="2026-03-07 07:13:37 +0000 UTC" firstStartedPulling="2026-03-07 07:13:39.827238022 +0000 UTC m=+838.292225343" lastFinishedPulling="2026-03-07 07:13:42.656238237 +0000 UTC m=+841.121225558" observedRunningTime="2026-03-07 07:13:43.230362079 +0000 UTC m=+841.695349400" watchObservedRunningTime="2026-03-07 07:13:43.231277643 +0000 UTC m=+841.696264964"
Mar 07 07:13:43 crc kubenswrapper[4738]: I0307 07:13:43.259778 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-rtspq" podStartSLOduration=2.6774852 podStartE2EDuration="6.259748994s" podCreationTimestamp="2026-03-07 07:13:37 +0000 UTC" firstStartedPulling="2026-03-07 07:13:39.048917284 +0000 UTC m=+837.513904595" lastFinishedPulling="2026-03-07 07:13:42.631181058 +0000 UTC m=+841.096168389" observedRunningTime="2026-03-07 07:13:43.25472042 +0000 UTC m=+841.719707741" watchObservedRunningTime="2026-03-07 07:13:43.259748994 +0000 UTC m=+841.724736315"
Mar 07 07:13:46 crc kubenswrapper[4738]: I0307 07:13:46.248240 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg" event={"ID":"0623b7d4-baca-4ea1-a095-551939ddc05f","Type":"ContainerStarted","Data":"52bb0ae0be1114cb863481c2c4987b2ab534c479469dc9080f74c1309c08be83"}
Mar 07 07:13:46 crc kubenswrapper[4738]: I0307 07:13:46.248847 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg"
Mar 07 07:13:46 crc kubenswrapper[4738]: I0307 07:13:46.251213 4738 generic.go:334] "Generic (PLEG): container finished" podID="69c55926-37e2-41f4-aaac-8d6aeacbfb47" containerID="42166fe6f922f5597fbcbbfe3ec8c74141a423fff9803fafa0a0889885861621" exitCode=0
Mar 07 07:13:46 crc kubenswrapper[4738]: I0307 07:13:46.251252 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9czq" event={"ID":"69c55926-37e2-41f4-aaac-8d6aeacbfb47","Type":"ContainerDied","Data":"42166fe6f922f5597fbcbbfe3ec8c74141a423fff9803fafa0a0889885861621"}
Mar 07 07:13:46 crc kubenswrapper[4738]: I0307 07:13:46.278974 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg" podStartSLOduration=2.49526223 podStartE2EDuration="9.278948871s" podCreationTimestamp="2026-03-07 07:13:37 +0000 UTC" firstStartedPulling="2026-03-07 07:13:38.835185003 +0000 UTC m=+837.300172334" lastFinishedPulling="2026-03-07 07:13:45.618871654 +0000 UTC m=+844.083858975" observedRunningTime="2026-03-07 07:13:46.271364718 +0000 UTC m=+844.736352049" watchObservedRunningTime="2026-03-07 07:13:46.278948871 +0000 UTC m=+844.743936212"
Mar 07 07:13:47 crc kubenswrapper[4738]: I0307 07:13:47.262391 4738 generic.go:334] "Generic (PLEG): container finished" podID="69c55926-37e2-41f4-aaac-8d6aeacbfb47" containerID="7e9ec0ac989eff854f31e6c188faeed0838cc233e33f88b2758da243fc78cf01" exitCode=0
Mar 07 07:13:47 crc kubenswrapper[4738]: I0307 07:13:47.262459 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9czq" event={"ID":"69c55926-37e2-41f4-aaac-8d6aeacbfb47","Type":"ContainerDied","Data":"7e9ec0ac989eff854f31e6c188faeed0838cc233e33f88b2758da243fc78cf01"}
Mar 07 07:13:48 crc kubenswrapper[4738]: I0307 07:13:48.273131 4738 generic.go:334] "Generic (PLEG): container finished" podID="69c55926-37e2-41f4-aaac-8d6aeacbfb47" containerID="fa03f586f1bc03d2d7f25f634b65406c8542d81d5472e1ee9ad2a9464b23ccad" exitCode=0
Mar 07 07:13:48 crc kubenswrapper[4738]: I0307 07:13:48.273192 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9czq" event={"ID":"69c55926-37e2-41f4-aaac-8d6aeacbfb47","Type":"ContainerDied","Data":"fa03f586f1bc03d2d7f25f634b65406c8542d81d5472e1ee9ad2a9464b23ccad"}
Mar 07 07:13:49 crc kubenswrapper[4738]: I0307 07:13:49.288150 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9czq" event={"ID":"69c55926-37e2-41f4-aaac-8d6aeacbfb47","Type":"ContainerStarted","Data":"02a7fe6e521574c931153999cc37d40e7bf4067888c263e84206c70f7ad5d389"}
Mar 07 07:13:49 crc kubenswrapper[4738]: I0307 07:13:49.288240 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9czq" event={"ID":"69c55926-37e2-41f4-aaac-8d6aeacbfb47","Type":"ContainerStarted","Data":"3198b22ddceb8be06f9f7b514f30e75693b0f0e6b8c55c03cef2ffc77e1c3c54"}
Mar 07 07:13:49 crc kubenswrapper[4738]: I0307 07:13:49.288255 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9czq" event={"ID":"69c55926-37e2-41f4-aaac-8d6aeacbfb47","Type":"ContainerStarted","Data":"a86a391fe23cb6fd01c7c6da514c1a4d88633303402af5744955c58a527e9287"}
Mar 07 07:13:49 crc kubenswrapper[4738]: I0307 07:13:49.288267 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9czq" event={"ID":"69c55926-37e2-41f4-aaac-8d6aeacbfb47","Type":"ContainerStarted","Data":"c42cc159d7de8d76900b417f212feee2394cf20497d7946b0293cb72feff6285"}
Mar 07 07:13:49 crc kubenswrapper[4738]: I0307 07:13:49.288280 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9czq" event={"ID":"69c55926-37e2-41f4-aaac-8d6aeacbfb47","Type":"ContainerStarted","Data":"878b39b9acd399331edf1dfac2f5a64f812289062c03cc240ab01efb92f37ed4"}
Mar 07 07:13:49 crc kubenswrapper[4738]: I0307 07:13:49.483224 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-vpfpk"
Mar 07 07:13:50 crc kubenswrapper[4738]: I0307 07:13:50.306140 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-n9czq" event={"ID":"69c55926-37e2-41f4-aaac-8d6aeacbfb47","Type":"ContainerStarted","Data":"99a5e5279e80fd648e4ae49d4fb17ad1b3a9baf0607f22fb8e546d8a0ea475a1"}
Mar 07 07:13:51 crc kubenswrapper[4738]: I0307 07:13:51.314029 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:52 crc kubenswrapper[4738]: I0307 07:13:52.881261 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:52 crc kubenswrapper[4738]: I0307 07:13:52.923818 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-n9czq"
Mar 07 07:13:52 crc kubenswrapper[4738]: I0307 07:13:52.954513 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-n9czq" podStartSLOduration=8.454614864 podStartE2EDuration="15.954496672s" podCreationTimestamp="2026-03-07 07:13:37 +0000 UTC" firstStartedPulling="2026-03-07 07:13:38.096823073 +0000 UTC m=+836.561810394" lastFinishedPulling="2026-03-07 07:13:45.596704881 +0000 UTC m=+844.061692202" observedRunningTime="2026-03-07 07:13:50.336025562 +0000 UTC m=+848.801012893" watchObservedRunningTime="2026-03-07 07:13:52.954496672 +0000 UTC m=+851.419484003"
Mar 07 07:13:55 crc kubenswrapper[4738]: I0307 07:13:55.205001 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-xbxmx"]
Mar 07 07:13:55 crc kubenswrapper[4738]: I0307 07:13:55.206466 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-xbxmx"
Mar 07 07:13:55 crc kubenswrapper[4738]: I0307 07:13:55.211299 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-ntv2z"
Mar 07 07:13:55 crc kubenswrapper[4738]: I0307 07:13:55.211703 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Mar 07 07:13:55 crc kubenswrapper[4738]: I0307 07:13:55.213669 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Mar 07 07:13:55 crc kubenswrapper[4738]: I0307 07:13:55.259691 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmkn6\" (UniqueName: \"kubernetes.io/projected/1462b25e-1b95-4595-8d75-6c79da8b9d5a-kube-api-access-kmkn6\") pod \"mariadb-operator-index-xbxmx\" (UID: \"1462b25e-1b95-4595-8d75-6c79da8b9d5a\") " pod="openstack-operators/mariadb-operator-index-xbxmx"
Mar 07 07:13:55 crc kubenswrapper[4738]: I0307 07:13:55.300557 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-xbxmx"]
Mar 07 07:13:55 crc kubenswrapper[4738]: I0307 07:13:55.361824 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmkn6\" (UniqueName: \"kubernetes.io/projected/1462b25e-1b95-4595-8d75-6c79da8b9d5a-kube-api-access-kmkn6\") pod \"mariadb-operator-index-xbxmx\" (UID: \"1462b25e-1b95-4595-8d75-6c79da8b9d5a\") " pod="openstack-operators/mariadb-operator-index-xbxmx"
Mar 07 07:13:55 crc kubenswrapper[4738]: I0307 07:13:55.391534 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmkn6\" (UniqueName: \"kubernetes.io/projected/1462b25e-1b95-4595-8d75-6c79da8b9d5a-kube-api-access-kmkn6\") pod \"mariadb-operator-index-xbxmx\" (UID: \"1462b25e-1b95-4595-8d75-6c79da8b9d5a\") " pod="openstack-operators/mariadb-operator-index-xbxmx"
Mar 07 07:13:55 crc kubenswrapper[4738]: I0307 07:13:55.563066 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-xbxmx"
Mar 07 07:13:55 crc kubenswrapper[4738]: I0307 07:13:55.989543 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-xbxmx"]
Mar 07 07:13:56 crc kubenswrapper[4738]: W0307 07:13:56.003327 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1462b25e_1b95_4595_8d75_6c79da8b9d5a.slice/crio-30f310365266a289a6404e7647ce0fa15af62b0c4f7da9fa63d98f931053caf9 WatchSource:0}: Error finding container 30f310365266a289a6404e7647ce0fa15af62b0c4f7da9fa63d98f931053caf9: Status 404 returned error can't find the container with id 30f310365266a289a6404e7647ce0fa15af62b0c4f7da9fa63d98f931053caf9
Mar 07 07:13:56 crc kubenswrapper[4738]: I0307 07:13:56.353981 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-xbxmx" event={"ID":"1462b25e-1b95-4595-8d75-6c79da8b9d5a","Type":"ContainerStarted","Data":"30f310365266a289a6404e7647ce0fa15af62b0c4f7da9fa63d98f931053caf9"}
Mar 07 07:13:57 crc kubenswrapper[4738]: I0307 07:13:57.383346 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-xbxmx" event={"ID":"1462b25e-1b95-4595-8d75-6c79da8b9d5a","Type":"ContainerStarted","Data":"0cd6591791e25129d10e5890e365d5910df704979c659d09cfa859f1f66116cc"}
Mar 07 07:13:57 crc kubenswrapper[4738]: I0307 07:13:57.424642 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-xbxmx" podStartSLOduration=1.702252837 podStartE2EDuration="2.424587509s" podCreationTimestamp="2026-03-07 07:13:55 +0000 UTC" firstStartedPulling="2026-03-07 07:13:56.007054201 +0000 UTC
m=+854.472041532" lastFinishedPulling="2026-03-07 07:13:56.729388883 +0000 UTC m=+855.194376204" observedRunningTime="2026-03-07 07:13:57.41529553 +0000 UTC m=+855.880282851" watchObservedRunningTime="2026-03-07 07:13:57.424587509 +0000 UTC m=+855.889574840" Mar 07 07:13:58 crc kubenswrapper[4738]: I0307 07:13:58.499110 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-2dvtg" Mar 07 07:13:58 crc kubenswrapper[4738]: I0307 07:13:58.599735 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-xbxmx"] Mar 07 07:13:58 crc kubenswrapper[4738]: I0307 07:13:58.602574 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-rtspq" Mar 07 07:13:59 crc kubenswrapper[4738]: I0307 07:13:59.179378 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-qzgzb"] Mar 07 07:13:59 crc kubenswrapper[4738]: I0307 07:13:59.180455 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-qzgzb" Mar 07 07:13:59 crc kubenswrapper[4738]: I0307 07:13:59.202473 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-qzgzb"] Mar 07 07:13:59 crc kubenswrapper[4738]: I0307 07:13:59.226334 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmqf8\" (UniqueName: \"kubernetes.io/projected/c6ada68d-3a9c-41c5-a8a4-1abe8cf757b9-kube-api-access-kmqf8\") pod \"mariadb-operator-index-qzgzb\" (UID: \"c6ada68d-3a9c-41c5-a8a4-1abe8cf757b9\") " pod="openstack-operators/mariadb-operator-index-qzgzb" Mar 07 07:13:59 crc kubenswrapper[4738]: I0307 07:13:59.328341 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmqf8\" (UniqueName: \"kubernetes.io/projected/c6ada68d-3a9c-41c5-a8a4-1abe8cf757b9-kube-api-access-kmqf8\") pod \"mariadb-operator-index-qzgzb\" (UID: \"c6ada68d-3a9c-41c5-a8a4-1abe8cf757b9\") " pod="openstack-operators/mariadb-operator-index-qzgzb" Mar 07 07:13:59 crc kubenswrapper[4738]: I0307 07:13:59.355533 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmqf8\" (UniqueName: \"kubernetes.io/projected/c6ada68d-3a9c-41c5-a8a4-1abe8cf757b9-kube-api-access-kmqf8\") pod \"mariadb-operator-index-qzgzb\" (UID: \"c6ada68d-3a9c-41c5-a8a4-1abe8cf757b9\") " pod="openstack-operators/mariadb-operator-index-qzgzb" Mar 07 07:13:59 crc kubenswrapper[4738]: I0307 07:13:59.395691 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-xbxmx" podUID="1462b25e-1b95-4595-8d75-6c79da8b9d5a" containerName="registry-server" containerID="cri-o://0cd6591791e25129d10e5890e365d5910df704979c659d09cfa859f1f66116cc" gracePeriod=2 Mar 07 07:13:59 crc kubenswrapper[4738]: I0307 07:13:59.543084 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-qzgzb" Mar 07 07:13:59 crc kubenswrapper[4738]: I0307 07:13:59.753713 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-qzgzb"] Mar 07 07:13:59 crc kubenswrapper[4738]: W0307 07:13:59.989440 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6ada68d_3a9c_41c5_a8a4_1abe8cf757b9.slice/crio-7c69f7ee89893aa17d4e2178c9fa55fb3beb0bdd4c8798309dfe51c293fba3f2 WatchSource:0}: Error finding container 7c69f7ee89893aa17d4e2178c9fa55fb3beb0bdd4c8798309dfe51c293fba3f2: Status 404 returned error can't find the container with id 7c69f7ee89893aa17d4e2178c9fa55fb3beb0bdd4c8798309dfe51c293fba3f2 Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.140542 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547794-gr7mp"] Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.141953 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547794-gr7mp" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.145751 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.146096 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547794-gr7mp"] Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.146582 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.147307 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.240478 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrzwm\" (UniqueName: \"kubernetes.io/projected/37661a52-49a3-4a00-abde-d7a211efde35-kube-api-access-xrzwm\") pod \"auto-csr-approver-29547794-gr7mp\" (UID: \"37661a52-49a3-4a00-abde-d7a211efde35\") " pod="openshift-infra/auto-csr-approver-29547794-gr7mp" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.311941 4738 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.342616 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrzwm\" (UniqueName: \"kubernetes.io/projected/37661a52-49a3-4a00-abde-d7a211efde35-kube-api-access-xrzwm\") pod \"auto-csr-approver-29547794-gr7mp\" (UID: \"37661a52-49a3-4a00-abde-d7a211efde35\") " pod="openshift-infra/auto-csr-approver-29547794-gr7mp" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.362360 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrzwm\" 
(UniqueName: \"kubernetes.io/projected/37661a52-49a3-4a00-abde-d7a211efde35-kube-api-access-xrzwm\") pod \"auto-csr-approver-29547794-gr7mp\" (UID: \"37661a52-49a3-4a00-abde-d7a211efde35\") " pod="openshift-infra/auto-csr-approver-29547794-gr7mp" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.382228 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-xbxmx" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.405831 4738 generic.go:334] "Generic (PLEG): container finished" podID="1462b25e-1b95-4595-8d75-6c79da8b9d5a" containerID="0cd6591791e25129d10e5890e365d5910df704979c659d09cfa859f1f66116cc" exitCode=0 Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.405884 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-xbxmx" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.405926 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-xbxmx" event={"ID":"1462b25e-1b95-4595-8d75-6c79da8b9d5a","Type":"ContainerDied","Data":"0cd6591791e25129d10e5890e365d5910df704979c659d09cfa859f1f66116cc"} Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.406019 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-xbxmx" event={"ID":"1462b25e-1b95-4595-8d75-6c79da8b9d5a","Type":"ContainerDied","Data":"30f310365266a289a6404e7647ce0fa15af62b0c4f7da9fa63d98f931053caf9"} Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.406058 4738 scope.go:117] "RemoveContainer" containerID="0cd6591791e25129d10e5890e365d5910df704979c659d09cfa859f1f66116cc" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.407193 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-qzgzb" 
event={"ID":"c6ada68d-3a9c-41c5-a8a4-1abe8cf757b9","Type":"ContainerStarted","Data":"7c69f7ee89893aa17d4e2178c9fa55fb3beb0bdd4c8798309dfe51c293fba3f2"} Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.435756 4738 scope.go:117] "RemoveContainer" containerID="0cd6591791e25129d10e5890e365d5910df704979c659d09cfa859f1f66116cc" Mar 07 07:14:00 crc kubenswrapper[4738]: E0307 07:14:00.436392 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cd6591791e25129d10e5890e365d5910df704979c659d09cfa859f1f66116cc\": container with ID starting with 0cd6591791e25129d10e5890e365d5910df704979c659d09cfa859f1f66116cc not found: ID does not exist" containerID="0cd6591791e25129d10e5890e365d5910df704979c659d09cfa859f1f66116cc" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.436443 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cd6591791e25129d10e5890e365d5910df704979c659d09cfa859f1f66116cc"} err="failed to get container status \"0cd6591791e25129d10e5890e365d5910df704979c659d09cfa859f1f66116cc\": rpc error: code = NotFound desc = could not find container \"0cd6591791e25129d10e5890e365d5910df704979c659d09cfa859f1f66116cc\": container with ID starting with 0cd6591791e25129d10e5890e365d5910df704979c659d09cfa859f1f66116cc not found: ID does not exist" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.444128 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmkn6\" (UniqueName: \"kubernetes.io/projected/1462b25e-1b95-4595-8d75-6c79da8b9d5a-kube-api-access-kmkn6\") pod \"1462b25e-1b95-4595-8d75-6c79da8b9d5a\" (UID: \"1462b25e-1b95-4595-8d75-6c79da8b9d5a\") " Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.460878 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1462b25e-1b95-4595-8d75-6c79da8b9d5a-kube-api-access-kmkn6" 
(OuterVolumeSpecName: "kube-api-access-kmkn6") pod "1462b25e-1b95-4595-8d75-6c79da8b9d5a" (UID: "1462b25e-1b95-4595-8d75-6c79da8b9d5a"). InnerVolumeSpecName "kube-api-access-kmkn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.469396 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547794-gr7mp" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.546344 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmkn6\" (UniqueName: \"kubernetes.io/projected/1462b25e-1b95-4595-8d75-6c79da8b9d5a-kube-api-access-kmkn6\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.736363 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-xbxmx"] Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.747114 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-xbxmx"] Mar 07 07:14:00 crc kubenswrapper[4738]: I0307 07:14:00.897536 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547794-gr7mp"] Mar 07 07:14:00 crc kubenswrapper[4738]: W0307 07:14:00.907387 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37661a52_49a3_4a00_abde_d7a211efde35.slice/crio-be668f349247a4b20fd8e779d7e0b2e194c020281e992b72bfb4c2599c28ca64 WatchSource:0}: Error finding container be668f349247a4b20fd8e779d7e0b2e194c020281e992b72bfb4c2599c28ca64: Status 404 returned error can't find the container with id be668f349247a4b20fd8e779d7e0b2e194c020281e992b72bfb4c2599c28ca64 Mar 07 07:14:01 crc kubenswrapper[4738]: I0307 07:14:01.416083 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-qzgzb" 
event={"ID":"c6ada68d-3a9c-41c5-a8a4-1abe8cf757b9","Type":"ContainerStarted","Data":"759140614f05a9abc2bbef03e8fc186141892062bc4a8e7abe7b70b5034e8cff"} Mar 07 07:14:01 crc kubenswrapper[4738]: I0307 07:14:01.420028 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547794-gr7mp" event={"ID":"37661a52-49a3-4a00-abde-d7a211efde35","Type":"ContainerStarted","Data":"be668f349247a4b20fd8e779d7e0b2e194c020281e992b72bfb4c2599c28ca64"} Mar 07 07:14:01 crc kubenswrapper[4738]: I0307 07:14:01.435598 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-qzgzb" podStartSLOduration=1.969189235 podStartE2EDuration="2.435571268s" podCreationTimestamp="2026-03-07 07:13:59 +0000 UTC" firstStartedPulling="2026-03-07 07:13:59.997336266 +0000 UTC m=+858.462323587" lastFinishedPulling="2026-03-07 07:14:00.463718299 +0000 UTC m=+858.928705620" observedRunningTime="2026-03-07 07:14:01.434119689 +0000 UTC m=+859.899107030" watchObservedRunningTime="2026-03-07 07:14:01.435571268 +0000 UTC m=+859.900558589" Mar 07 07:14:02 crc kubenswrapper[4738]: I0307 07:14:02.394122 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1462b25e-1b95-4595-8d75-6c79da8b9d5a" path="/var/lib/kubelet/pods/1462b25e-1b95-4595-8d75-6c79da8b9d5a/volumes" Mar 07 07:14:02 crc kubenswrapper[4738]: I0307 07:14:02.430854 4738 generic.go:334] "Generic (PLEG): container finished" podID="37661a52-49a3-4a00-abde-d7a211efde35" containerID="dc65af93403a30c37b0d69574145cdcaa325ab4b9badbb646cee1887ddcd1f69" exitCode=0 Mar 07 07:14:02 crc kubenswrapper[4738]: I0307 07:14:02.430910 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547794-gr7mp" event={"ID":"37661a52-49a3-4a00-abde-d7a211efde35","Type":"ContainerDied","Data":"dc65af93403a30c37b0d69574145cdcaa325ab4b9badbb646cee1887ddcd1f69"} Mar 07 07:14:03 crc kubenswrapper[4738]: I0307 07:14:03.768007 
4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547794-gr7mp" Mar 07 07:14:03 crc kubenswrapper[4738]: I0307 07:14:03.799211 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrzwm\" (UniqueName: \"kubernetes.io/projected/37661a52-49a3-4a00-abde-d7a211efde35-kube-api-access-xrzwm\") pod \"37661a52-49a3-4a00-abde-d7a211efde35\" (UID: \"37661a52-49a3-4a00-abde-d7a211efde35\") " Mar 07 07:14:03 crc kubenswrapper[4738]: I0307 07:14:03.818454 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37661a52-49a3-4a00-abde-d7a211efde35-kube-api-access-xrzwm" (OuterVolumeSpecName: "kube-api-access-xrzwm") pod "37661a52-49a3-4a00-abde-d7a211efde35" (UID: "37661a52-49a3-4a00-abde-d7a211efde35"). InnerVolumeSpecName "kube-api-access-xrzwm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:03 crc kubenswrapper[4738]: I0307 07:14:03.901436 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrzwm\" (UniqueName: \"kubernetes.io/projected/37661a52-49a3-4a00-abde-d7a211efde35-kube-api-access-xrzwm\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:04 crc kubenswrapper[4738]: I0307 07:14:04.455454 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547794-gr7mp" event={"ID":"37661a52-49a3-4a00-abde-d7a211efde35","Type":"ContainerDied","Data":"be668f349247a4b20fd8e779d7e0b2e194c020281e992b72bfb4c2599c28ca64"} Mar 07 07:14:04 crc kubenswrapper[4738]: I0307 07:14:04.455503 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547794-gr7mp" Mar 07 07:14:04 crc kubenswrapper[4738]: I0307 07:14:04.455526 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be668f349247a4b20fd8e779d7e0b2e194c020281e992b72bfb4c2599c28ca64" Mar 07 07:14:04 crc kubenswrapper[4738]: I0307 07:14:04.860993 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547788-s4dz4"] Mar 07 07:14:04 crc kubenswrapper[4738]: I0307 07:14:04.869108 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547788-s4dz4"] Mar 07 07:14:06 crc kubenswrapper[4738]: I0307 07:14:06.395600 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6eeba904-9ec3-46e1-803d-d3c4fc286879" path="/var/lib/kubelet/pods/6eeba904-9ec3-46e1-803d-d3c4fc286879/volumes" Mar 07 07:14:07 crc kubenswrapper[4738]: I0307 07:14:07.884045 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-n9czq" Mar 07 07:14:09 crc kubenswrapper[4738]: I0307 07:14:09.544013 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-qzgzb" Mar 07 07:14:09 crc kubenswrapper[4738]: I0307 07:14:09.544214 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-qzgzb" Mar 07 07:14:09 crc kubenswrapper[4738]: I0307 07:14:09.579306 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-qzgzb" Mar 07 07:14:10 crc kubenswrapper[4738]: I0307 07:14:10.528375 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-qzgzb" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.841971 4738 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6"] Mar 07 07:14:15 crc kubenswrapper[4738]: E0307 07:14:15.842604 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37661a52-49a3-4a00-abde-d7a211efde35" containerName="oc" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.842624 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="37661a52-49a3-4a00-abde-d7a211efde35" containerName="oc" Mar 07 07:14:15 crc kubenswrapper[4738]: E0307 07:14:15.842672 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1462b25e-1b95-4595-8d75-6c79da8b9d5a" containerName="registry-server" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.842684 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="1462b25e-1b95-4595-8d75-6c79da8b9d5a" containerName="registry-server" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.842857 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="37661a52-49a3-4a00-abde-d7a211efde35" containerName="oc" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.842881 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="1462b25e-1b95-4595-8d75-6c79da8b9d5a" containerName="registry-server" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.844231 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.846483 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-fvfwj" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.863825 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6"] Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.869722 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwzcq\" (UniqueName: \"kubernetes.io/projected/b77797b2-b93a-4d76-9191-f334fde745fa-kube-api-access-lwzcq\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6\" (UID: \"b77797b2-b93a-4d76-9191-f334fde745fa\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.869799 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b77797b2-b93a-4d76-9191-f334fde745fa-util\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6\" (UID: \"b77797b2-b93a-4d76-9191-f334fde745fa\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.869959 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b77797b2-b93a-4d76-9191-f334fde745fa-bundle\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6\" (UID: \"b77797b2-b93a-4d76-9191-f334fde745fa\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 
07:14:15.971750 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwzcq\" (UniqueName: \"kubernetes.io/projected/b77797b2-b93a-4d76-9191-f334fde745fa-kube-api-access-lwzcq\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6\" (UID: \"b77797b2-b93a-4d76-9191-f334fde745fa\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.971823 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b77797b2-b93a-4d76-9191-f334fde745fa-util\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6\" (UID: \"b77797b2-b93a-4d76-9191-f334fde745fa\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.971853 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b77797b2-b93a-4d76-9191-f334fde745fa-bundle\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6\" (UID: \"b77797b2-b93a-4d76-9191-f334fde745fa\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.972633 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b77797b2-b93a-4d76-9191-f334fde745fa-bundle\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6\" (UID: \"b77797b2-b93a-4d76-9191-f334fde745fa\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.972785 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b77797b2-b93a-4d76-9191-f334fde745fa-util\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6\" (UID: \"b77797b2-b93a-4d76-9191-f334fde745fa\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" Mar 07 07:14:15 crc kubenswrapper[4738]: I0307 07:14:15.996886 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwzcq\" (UniqueName: \"kubernetes.io/projected/b77797b2-b93a-4d76-9191-f334fde745fa-kube-api-access-lwzcq\") pod \"449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6\" (UID: \"b77797b2-b93a-4d76-9191-f334fde745fa\") " pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" Mar 07 07:14:16 crc kubenswrapper[4738]: I0307 07:14:16.168606 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" Mar 07 07:14:16 crc kubenswrapper[4738]: I0307 07:14:16.660300 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6"] Mar 07 07:14:17 crc kubenswrapper[4738]: I0307 07:14:17.547767 4738 generic.go:334] "Generic (PLEG): container finished" podID="b77797b2-b93a-4d76-9191-f334fde745fa" containerID="b2ae4f982000bb0900f5cfd53a2cfe710bca4b6fd03b35be747d95e0104751f9" exitCode=0 Mar 07 07:14:17 crc kubenswrapper[4738]: I0307 07:14:17.547845 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" event={"ID":"b77797b2-b93a-4d76-9191-f334fde745fa","Type":"ContainerDied","Data":"b2ae4f982000bb0900f5cfd53a2cfe710bca4b6fd03b35be747d95e0104751f9"} Mar 07 07:14:17 crc kubenswrapper[4738]: I0307 07:14:17.550747 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" event={"ID":"b77797b2-b93a-4d76-9191-f334fde745fa","Type":"ContainerStarted","Data":"80c799b8e90e71c9a85ec52e1011f7584d67a9948abb063ea6032325825b143f"} Mar 07 07:14:18 crc kubenswrapper[4738]: I0307 07:14:18.568744 4738 generic.go:334] "Generic (PLEG): container finished" podID="b77797b2-b93a-4d76-9191-f334fde745fa" containerID="6fceb7062d8ca86a23721f92c9f01377b9b9f15e41f57dde4d5ca272b761b846" exitCode=0 Mar 07 07:14:18 crc kubenswrapper[4738]: I0307 07:14:18.568868 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" event={"ID":"b77797b2-b93a-4d76-9191-f334fde745fa","Type":"ContainerDied","Data":"6fceb7062d8ca86a23721f92c9f01377b9b9f15e41f57dde4d5ca272b761b846"} Mar 07 07:14:19 crc kubenswrapper[4738]: I0307 07:14:19.591983 4738 generic.go:334] "Generic (PLEG): container finished" podID="b77797b2-b93a-4d76-9191-f334fde745fa" containerID="02703a6884c927da197717766a7e5fce5f20e7fbb372b7ff6b1ac9b554d52306" exitCode=0 Mar 07 07:14:19 crc kubenswrapper[4738]: I0307 07:14:19.592074 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" event={"ID":"b77797b2-b93a-4d76-9191-f334fde745fa","Type":"ContainerDied","Data":"02703a6884c927da197717766a7e5fce5f20e7fbb372b7ff6b1ac9b554d52306"} Mar 07 07:14:20 crc kubenswrapper[4738]: I0307 07:14:20.881851 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" Mar 07 07:14:20 crc kubenswrapper[4738]: I0307 07:14:20.953366 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b77797b2-b93a-4d76-9191-f334fde745fa-util\") pod \"b77797b2-b93a-4d76-9191-f334fde745fa\" (UID: \"b77797b2-b93a-4d76-9191-f334fde745fa\") " Mar 07 07:14:20 crc kubenswrapper[4738]: I0307 07:14:20.953500 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b77797b2-b93a-4d76-9191-f334fde745fa-bundle\") pod \"b77797b2-b93a-4d76-9191-f334fde745fa\" (UID: \"b77797b2-b93a-4d76-9191-f334fde745fa\") " Mar 07 07:14:20 crc kubenswrapper[4738]: I0307 07:14:20.953631 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwzcq\" (UniqueName: \"kubernetes.io/projected/b77797b2-b93a-4d76-9191-f334fde745fa-kube-api-access-lwzcq\") pod \"b77797b2-b93a-4d76-9191-f334fde745fa\" (UID: \"b77797b2-b93a-4d76-9191-f334fde745fa\") " Mar 07 07:14:20 crc kubenswrapper[4738]: I0307 07:14:20.955995 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b77797b2-b93a-4d76-9191-f334fde745fa-bundle" (OuterVolumeSpecName: "bundle") pod "b77797b2-b93a-4d76-9191-f334fde745fa" (UID: "b77797b2-b93a-4d76-9191-f334fde745fa"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:14:20 crc kubenswrapper[4738]: I0307 07:14:20.961291 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b77797b2-b93a-4d76-9191-f334fde745fa-kube-api-access-lwzcq" (OuterVolumeSpecName: "kube-api-access-lwzcq") pod "b77797b2-b93a-4d76-9191-f334fde745fa" (UID: "b77797b2-b93a-4d76-9191-f334fde745fa"). InnerVolumeSpecName "kube-api-access-lwzcq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:20 crc kubenswrapper[4738]: I0307 07:14:20.967491 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b77797b2-b93a-4d76-9191-f334fde745fa-util" (OuterVolumeSpecName: "util") pod "b77797b2-b93a-4d76-9191-f334fde745fa" (UID: "b77797b2-b93a-4d76-9191-f334fde745fa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:14:21 crc kubenswrapper[4738]: I0307 07:14:21.057313 4738 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b77797b2-b93a-4d76-9191-f334fde745fa-util\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:21 crc kubenswrapper[4738]: I0307 07:14:21.057359 4738 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b77797b2-b93a-4d76-9191-f334fde745fa-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:21 crc kubenswrapper[4738]: I0307 07:14:21.057375 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwzcq\" (UniqueName: \"kubernetes.io/projected/b77797b2-b93a-4d76-9191-f334fde745fa-kube-api-access-lwzcq\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:21 crc kubenswrapper[4738]: I0307 07:14:21.612457 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" event={"ID":"b77797b2-b93a-4d76-9191-f334fde745fa","Type":"ContainerDied","Data":"80c799b8e90e71c9a85ec52e1011f7584d67a9948abb063ea6032325825b143f"} Mar 07 07:14:21 crc kubenswrapper[4738]: I0307 07:14:21.612974 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80c799b8e90e71c9a85ec52e1011f7584d67a9948abb063ea6032325825b143f" Mar 07 07:14:21 crc kubenswrapper[4738]: I0307 07:14:21.612588 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6" Mar 07 07:14:26 crc kubenswrapper[4738]: I0307 07:14:26.957571 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:14:26 crc kubenswrapper[4738]: I0307 07:14:26.957850 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.479410 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw"] Mar 07 07:14:29 crc kubenswrapper[4738]: E0307 07:14:29.479946 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b77797b2-b93a-4d76-9191-f334fde745fa" containerName="util" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.479960 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b77797b2-b93a-4d76-9191-f334fde745fa" containerName="util" Mar 07 07:14:29 crc kubenswrapper[4738]: E0307 07:14:29.480146 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b77797b2-b93a-4d76-9191-f334fde745fa" containerName="extract" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.480190 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b77797b2-b93a-4d76-9191-f334fde745fa" containerName="extract" Mar 07 07:14:29 crc kubenswrapper[4738]: E0307 07:14:29.480212 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b77797b2-b93a-4d76-9191-f334fde745fa" 
containerName="pull" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.480242 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b77797b2-b93a-4d76-9191-f334fde745fa" containerName="pull" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.480452 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="b77797b2-b93a-4d76-9191-f334fde745fa" containerName="extract" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.480983 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.484105 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.484533 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.486861 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-s4nh5" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.547768 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw"] Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.589038 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6fvm\" (UniqueName: \"kubernetes.io/projected/157f5b45-2eb2-4da7-bea9-722b108fa769-kube-api-access-q6fvm\") pod \"mariadb-operator-controller-manager-7db7f5474b-5qcsw\" (UID: \"157f5b45-2eb2-4da7-bea9-722b108fa769\") " pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.589288 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/157f5b45-2eb2-4da7-bea9-722b108fa769-apiservice-cert\") pod \"mariadb-operator-controller-manager-7db7f5474b-5qcsw\" (UID: \"157f5b45-2eb2-4da7-bea9-722b108fa769\") " pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.589399 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/157f5b45-2eb2-4da7-bea9-722b108fa769-webhook-cert\") pod \"mariadb-operator-controller-manager-7db7f5474b-5qcsw\" (UID: \"157f5b45-2eb2-4da7-bea9-722b108fa769\") " pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.691400 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6fvm\" (UniqueName: \"kubernetes.io/projected/157f5b45-2eb2-4da7-bea9-722b108fa769-kube-api-access-q6fvm\") pod \"mariadb-operator-controller-manager-7db7f5474b-5qcsw\" (UID: \"157f5b45-2eb2-4da7-bea9-722b108fa769\") " pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.691497 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/157f5b45-2eb2-4da7-bea9-722b108fa769-apiservice-cert\") pod \"mariadb-operator-controller-manager-7db7f5474b-5qcsw\" (UID: \"157f5b45-2eb2-4da7-bea9-722b108fa769\") " pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.691530 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/157f5b45-2eb2-4da7-bea9-722b108fa769-webhook-cert\") pod 
\"mariadb-operator-controller-manager-7db7f5474b-5qcsw\" (UID: \"157f5b45-2eb2-4da7-bea9-722b108fa769\") " pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.698162 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/157f5b45-2eb2-4da7-bea9-722b108fa769-webhook-cert\") pod \"mariadb-operator-controller-manager-7db7f5474b-5qcsw\" (UID: \"157f5b45-2eb2-4da7-bea9-722b108fa769\") " pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.703615 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/157f5b45-2eb2-4da7-bea9-722b108fa769-apiservice-cert\") pod \"mariadb-operator-controller-manager-7db7f5474b-5qcsw\" (UID: \"157f5b45-2eb2-4da7-bea9-722b108fa769\") " pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.711905 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6fvm\" (UniqueName: \"kubernetes.io/projected/157f5b45-2eb2-4da7-bea9-722b108fa769-kube-api-access-q6fvm\") pod \"mariadb-operator-controller-manager-7db7f5474b-5qcsw\" (UID: \"157f5b45-2eb2-4da7-bea9-722b108fa769\") " pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" Mar 07 07:14:29 crc kubenswrapper[4738]: I0307 07:14:29.801753 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" Mar 07 07:14:30 crc kubenswrapper[4738]: I0307 07:14:30.051382 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw"] Mar 07 07:14:30 crc kubenswrapper[4738]: I0307 07:14:30.687371 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" event={"ID":"157f5b45-2eb2-4da7-bea9-722b108fa769","Type":"ContainerStarted","Data":"9a3239a24009b6d9406a56c7615c3b145874e7931af1cdea70b4f7c8f10de232"} Mar 07 07:14:34 crc kubenswrapper[4738]: I0307 07:14:34.728061 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" event={"ID":"157f5b45-2eb2-4da7-bea9-722b108fa769","Type":"ContainerStarted","Data":"090c3b52ca675a2fd34f6934fe81db7bb4ecc86d9fac106857d78a9557638d7e"} Mar 07 07:14:34 crc kubenswrapper[4738]: I0307 07:14:34.730307 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" Mar 07 07:14:34 crc kubenswrapper[4738]: I0307 07:14:34.751707 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" podStartSLOduration=2.064262453 podStartE2EDuration="5.751673576s" podCreationTimestamp="2026-03-07 07:14:29 +0000 UTC" firstStartedPulling="2026-03-07 07:14:30.064312593 +0000 UTC m=+888.529299914" lastFinishedPulling="2026-03-07 07:14:33.751723716 +0000 UTC m=+892.216711037" observedRunningTime="2026-03-07 07:14:34.747361782 +0000 UTC m=+893.212349093" watchObservedRunningTime="2026-03-07 07:14:34.751673576 +0000 UTC m=+893.216660897" Mar 07 07:14:39 crc kubenswrapper[4738]: I0307 07:14:39.807259 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/mariadb-operator-controller-manager-7db7f5474b-5qcsw" Mar 07 07:14:43 crc kubenswrapper[4738]: I0307 07:14:43.324776 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-mjln4"] Mar 07 07:14:43 crc kubenswrapper[4738]: I0307 07:14:43.326299 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-mjln4" Mar 07 07:14:43 crc kubenswrapper[4738]: I0307 07:14:43.333183 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-q54vm" Mar 07 07:14:43 crc kubenswrapper[4738]: I0307 07:14:43.373912 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-mjln4"] Mar 07 07:14:43 crc kubenswrapper[4738]: I0307 07:14:43.435600 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q6bd\" (UniqueName: \"kubernetes.io/projected/87e39117-3f2f-4d5c-97ba-8e119c415858-kube-api-access-2q6bd\") pod \"infra-operator-index-mjln4\" (UID: \"87e39117-3f2f-4d5c-97ba-8e119c415858\") " pod="openstack-operators/infra-operator-index-mjln4" Mar 07 07:14:43 crc kubenswrapper[4738]: I0307 07:14:43.537344 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q6bd\" (UniqueName: \"kubernetes.io/projected/87e39117-3f2f-4d5c-97ba-8e119c415858-kube-api-access-2q6bd\") pod \"infra-operator-index-mjln4\" (UID: \"87e39117-3f2f-4d5c-97ba-8e119c415858\") " pod="openstack-operators/infra-operator-index-mjln4" Mar 07 07:14:43 crc kubenswrapper[4738]: I0307 07:14:43.715628 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q6bd\" (UniqueName: \"kubernetes.io/projected/87e39117-3f2f-4d5c-97ba-8e119c415858-kube-api-access-2q6bd\") pod \"infra-operator-index-mjln4\" (UID: 
\"87e39117-3f2f-4d5c-97ba-8e119c415858\") " pod="openstack-operators/infra-operator-index-mjln4" Mar 07 07:14:43 crc kubenswrapper[4738]: I0307 07:14:43.804073 4738 scope.go:117] "RemoveContainer" containerID="b1be626607c2a7a38c792c624e09efeecef7f88bb38aa1f7524c723c15aad2ec" Mar 07 07:14:43 crc kubenswrapper[4738]: I0307 07:14:43.944687 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-mjln4" Mar 07 07:14:44 crc kubenswrapper[4738]: W0307 07:14:44.538633 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87e39117_3f2f_4d5c_97ba_8e119c415858.slice/crio-115e31867a79ed73b088923255e0258e6edad6d76f839b1e43fd0e690a6294a1 WatchSource:0}: Error finding container 115e31867a79ed73b088923255e0258e6edad6d76f839b1e43fd0e690a6294a1: Status 404 returned error can't find the container with id 115e31867a79ed73b088923255e0258e6edad6d76f839b1e43fd0e690a6294a1 Mar 07 07:14:44 crc kubenswrapper[4738]: I0307 07:14:44.543647 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-mjln4"] Mar 07 07:14:44 crc kubenswrapper[4738]: I0307 07:14:44.800631 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-mjln4" event={"ID":"87e39117-3f2f-4d5c-97ba-8e119c415858","Type":"ContainerStarted","Data":"115e31867a79ed73b088923255e0258e6edad6d76f839b1e43fd0e690a6294a1"} Mar 07 07:14:45 crc kubenswrapper[4738]: I0307 07:14:45.807783 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-mjln4" event={"ID":"87e39117-3f2f-4d5c-97ba-8e119c415858","Type":"ContainerStarted","Data":"14400926a09331ace6e55dd087fed2830952ab097648ffb68f0cd0b6cce9dbb7"} Mar 07 07:14:45 crc kubenswrapper[4738]: I0307 07:14:45.826955 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/infra-operator-index-mjln4" podStartSLOduration=2.101946251 podStartE2EDuration="2.826931284s" podCreationTimestamp="2026-03-07 07:14:43 +0000 UTC" firstStartedPulling="2026-03-07 07:14:44.542104091 +0000 UTC m=+903.007091412" lastFinishedPulling="2026-03-07 07:14:45.267089124 +0000 UTC m=+903.732076445" observedRunningTime="2026-03-07 07:14:45.823196583 +0000 UTC m=+904.288183934" watchObservedRunningTime="2026-03-07 07:14:45.826931284 +0000 UTC m=+904.291918645" Mar 07 07:14:46 crc kubenswrapper[4738]: I0307 07:14:46.086966 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-mjln4"] Mar 07 07:14:46 crc kubenswrapper[4738]: I0307 07:14:46.696866 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-vmk7s"] Mar 07 07:14:46 crc kubenswrapper[4738]: I0307 07:14:46.697991 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-vmk7s" Mar 07 07:14:46 crc kubenswrapper[4738]: I0307 07:14:46.712953 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-vmk7s"] Mar 07 07:14:46 crc kubenswrapper[4738]: I0307 07:14:46.789913 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8khx\" (UniqueName: \"kubernetes.io/projected/052550e0-03bd-4c62-8393-b43abaafb198-kube-api-access-x8khx\") pod \"infra-operator-index-vmk7s\" (UID: \"052550e0-03bd-4c62-8393-b43abaafb198\") " pod="openstack-operators/infra-operator-index-vmk7s" Mar 07 07:14:46 crc kubenswrapper[4738]: I0307 07:14:46.892190 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8khx\" (UniqueName: \"kubernetes.io/projected/052550e0-03bd-4c62-8393-b43abaafb198-kube-api-access-x8khx\") pod \"infra-operator-index-vmk7s\" (UID: \"052550e0-03bd-4c62-8393-b43abaafb198\") " 
pod="openstack-operators/infra-operator-index-vmk7s" Mar 07 07:14:46 crc kubenswrapper[4738]: I0307 07:14:46.917855 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8khx\" (UniqueName: \"kubernetes.io/projected/052550e0-03bd-4c62-8393-b43abaafb198-kube-api-access-x8khx\") pod \"infra-operator-index-vmk7s\" (UID: \"052550e0-03bd-4c62-8393-b43abaafb198\") " pod="openstack-operators/infra-operator-index-vmk7s" Mar 07 07:14:47 crc kubenswrapper[4738]: I0307 07:14:47.014955 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-vmk7s" Mar 07 07:14:47 crc kubenswrapper[4738]: I0307 07:14:47.537928 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-vmk7s"] Mar 07 07:14:47 crc kubenswrapper[4738]: I0307 07:14:47.826094 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-vmk7s" event={"ID":"052550e0-03bd-4c62-8393-b43abaafb198","Type":"ContainerStarted","Data":"f5e2dce8f15408d1e3732fa720e28e680edea74522b0588cdadc7cf52e66b5db"} Mar 07 07:14:47 crc kubenswrapper[4738]: I0307 07:14:47.826254 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-mjln4" podUID="87e39117-3f2f-4d5c-97ba-8e119c415858" containerName="registry-server" containerID="cri-o://14400926a09331ace6e55dd087fed2830952ab097648ffb68f0cd0b6cce9dbb7" gracePeriod=2 Mar 07 07:14:48 crc kubenswrapper[4738]: I0307 07:14:48.240258 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-index-mjln4" Mar 07 07:14:48 crc kubenswrapper[4738]: I0307 07:14:48.313790 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2q6bd\" (UniqueName: \"kubernetes.io/projected/87e39117-3f2f-4d5c-97ba-8e119c415858-kube-api-access-2q6bd\") pod \"87e39117-3f2f-4d5c-97ba-8e119c415858\" (UID: \"87e39117-3f2f-4d5c-97ba-8e119c415858\") " Mar 07 07:14:48 crc kubenswrapper[4738]: I0307 07:14:48.327420 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87e39117-3f2f-4d5c-97ba-8e119c415858-kube-api-access-2q6bd" (OuterVolumeSpecName: "kube-api-access-2q6bd") pod "87e39117-3f2f-4d5c-97ba-8e119c415858" (UID: "87e39117-3f2f-4d5c-97ba-8e119c415858"). InnerVolumeSpecName "kube-api-access-2q6bd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:14:48 crc kubenswrapper[4738]: I0307 07:14:48.415548 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2q6bd\" (UniqueName: \"kubernetes.io/projected/87e39117-3f2f-4d5c-97ba-8e119c415858-kube-api-access-2q6bd\") on node \"crc\" DevicePath \"\"" Mar 07 07:14:48 crc kubenswrapper[4738]: I0307 07:14:48.836659 4738 generic.go:334] "Generic (PLEG): container finished" podID="87e39117-3f2f-4d5c-97ba-8e119c415858" containerID="14400926a09331ace6e55dd087fed2830952ab097648ffb68f0cd0b6cce9dbb7" exitCode=0 Mar 07 07:14:48 crc kubenswrapper[4738]: I0307 07:14:48.836751 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-mjln4" event={"ID":"87e39117-3f2f-4d5c-97ba-8e119c415858","Type":"ContainerDied","Data":"14400926a09331ace6e55dd087fed2830952ab097648ffb68f0cd0b6cce9dbb7"} Mar 07 07:14:48 crc kubenswrapper[4738]: I0307 07:14:48.837276 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-mjln4" 
event={"ID":"87e39117-3f2f-4d5c-97ba-8e119c415858","Type":"ContainerDied","Data":"115e31867a79ed73b088923255e0258e6edad6d76f839b1e43fd0e690a6294a1"} Mar 07 07:14:48 crc kubenswrapper[4738]: I0307 07:14:48.836799 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-mjln4" Mar 07 07:14:48 crc kubenswrapper[4738]: I0307 07:14:48.837344 4738 scope.go:117] "RemoveContainer" containerID="14400926a09331ace6e55dd087fed2830952ab097648ffb68f0cd0b6cce9dbb7" Mar 07 07:14:48 crc kubenswrapper[4738]: I0307 07:14:48.838722 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-vmk7s" event={"ID":"052550e0-03bd-4c62-8393-b43abaafb198","Type":"ContainerStarted","Data":"91cfe594c246f413abfa835effd1ec79590300718d69aa7d7bcbf2e8538a9f58"} Mar 07 07:14:48 crc kubenswrapper[4738]: I0307 07:14:48.858691 4738 scope.go:117] "RemoveContainer" containerID="14400926a09331ace6e55dd087fed2830952ab097648ffb68f0cd0b6cce9dbb7" Mar 07 07:14:48 crc kubenswrapper[4738]: E0307 07:14:48.859299 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14400926a09331ace6e55dd087fed2830952ab097648ffb68f0cd0b6cce9dbb7\": container with ID starting with 14400926a09331ace6e55dd087fed2830952ab097648ffb68f0cd0b6cce9dbb7 not found: ID does not exist" containerID="14400926a09331ace6e55dd087fed2830952ab097648ffb68f0cd0b6cce9dbb7" Mar 07 07:14:48 crc kubenswrapper[4738]: I0307 07:14:48.859363 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14400926a09331ace6e55dd087fed2830952ab097648ffb68f0cd0b6cce9dbb7"} err="failed to get container status \"14400926a09331ace6e55dd087fed2830952ab097648ffb68f0cd0b6cce9dbb7\": rpc error: code = NotFound desc = could not find container \"14400926a09331ace6e55dd087fed2830952ab097648ffb68f0cd0b6cce9dbb7\": container with ID starting with 
14400926a09331ace6e55dd087fed2830952ab097648ffb68f0cd0b6cce9dbb7 not found: ID does not exist" Mar 07 07:14:48 crc kubenswrapper[4738]: I0307 07:14:48.872400 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-vmk7s" podStartSLOduration=2.315363863 podStartE2EDuration="2.872368818s" podCreationTimestamp="2026-03-07 07:14:46 +0000 UTC" firstStartedPulling="2026-03-07 07:14:47.54452502 +0000 UTC m=+906.009512341" lastFinishedPulling="2026-03-07 07:14:48.101529975 +0000 UTC m=+906.566517296" observedRunningTime="2026-03-07 07:14:48.868438212 +0000 UTC m=+907.333425553" watchObservedRunningTime="2026-03-07 07:14:48.872368818 +0000 UTC m=+907.337356149" Mar 07 07:14:48 crc kubenswrapper[4738]: I0307 07:14:48.890196 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-mjln4"] Mar 07 07:14:48 crc kubenswrapper[4738]: I0307 07:14:48.895624 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-mjln4"] Mar 07 07:14:50 crc kubenswrapper[4738]: I0307 07:14:50.398880 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87e39117-3f2f-4d5c-97ba-8e119c415858" path="/var/lib/kubelet/pods/87e39117-3f2f-4d5c-97ba-8e119c415858/volumes" Mar 07 07:14:56 crc kubenswrapper[4738]: I0307 07:14:56.958676 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:14:56 crc kubenswrapper[4738]: I0307 07:14:56.959864 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:14:57 crc kubenswrapper[4738]: I0307 07:14:57.015744 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-vmk7s" Mar 07 07:14:57 crc kubenswrapper[4738]: I0307 07:14:57.016222 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-vmk7s" Mar 07 07:14:57 crc kubenswrapper[4738]: I0307 07:14:57.072327 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-vmk7s" Mar 07 07:14:57 crc kubenswrapper[4738]: I0307 07:14:57.926702 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-vmk7s" Mar 07 07:14:58 crc kubenswrapper[4738]: I0307 07:14:58.752975 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28"] Mar 07 07:14:58 crc kubenswrapper[4738]: E0307 07:14:58.753613 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e39117-3f2f-4d5c-97ba-8e119c415858" containerName="registry-server" Mar 07 07:14:58 crc kubenswrapper[4738]: I0307 07:14:58.753634 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e39117-3f2f-4d5c-97ba-8e119c415858" containerName="registry-server" Mar 07 07:14:58 crc kubenswrapper[4738]: I0307 07:14:58.753932 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="87e39117-3f2f-4d5c-97ba-8e119c415858" containerName="registry-server" Mar 07 07:14:58 crc kubenswrapper[4738]: I0307 07:14:58.756280 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" Mar 07 07:14:58 crc kubenswrapper[4738]: I0307 07:14:58.759878 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-fvfwj" Mar 07 07:14:58 crc kubenswrapper[4738]: I0307 07:14:58.782234 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28"] Mar 07 07:14:58 crc kubenswrapper[4738]: I0307 07:14:58.881746 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckvsw\" (UniqueName: \"kubernetes.io/projected/ff8317ef-e259-4059-b434-33fdfd265c20-kube-api-access-ckvsw\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28\" (UID: \"ff8317ef-e259-4059-b434-33fdfd265c20\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" Mar 07 07:14:58 crc kubenswrapper[4738]: I0307 07:14:58.881805 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ff8317ef-e259-4059-b434-33fdfd265c20-bundle\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28\" (UID: \"ff8317ef-e259-4059-b434-33fdfd265c20\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" Mar 07 07:14:58 crc kubenswrapper[4738]: I0307 07:14:58.881868 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ff8317ef-e259-4059-b434-33fdfd265c20-util\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28\" (UID: \"ff8317ef-e259-4059-b434-33fdfd265c20\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" Mar 07 07:14:58 crc kubenswrapper[4738]: I0307 
07:14:58.983547 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckvsw\" (UniqueName: \"kubernetes.io/projected/ff8317ef-e259-4059-b434-33fdfd265c20-kube-api-access-ckvsw\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28\" (UID: \"ff8317ef-e259-4059-b434-33fdfd265c20\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" Mar 07 07:14:58 crc kubenswrapper[4738]: I0307 07:14:58.983593 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ff8317ef-e259-4059-b434-33fdfd265c20-bundle\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28\" (UID: \"ff8317ef-e259-4059-b434-33fdfd265c20\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" Mar 07 07:14:58 crc kubenswrapper[4738]: I0307 07:14:58.983642 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ff8317ef-e259-4059-b434-33fdfd265c20-util\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28\" (UID: \"ff8317ef-e259-4059-b434-33fdfd265c20\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" Mar 07 07:14:58 crc kubenswrapper[4738]: I0307 07:14:58.984070 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ff8317ef-e259-4059-b434-33fdfd265c20-util\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28\" (UID: \"ff8317ef-e259-4059-b434-33fdfd265c20\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" Mar 07 07:14:58 crc kubenswrapper[4738]: I0307 07:14:58.984664 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/ff8317ef-e259-4059-b434-33fdfd265c20-bundle\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28\" (UID: \"ff8317ef-e259-4059-b434-33fdfd265c20\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" Mar 07 07:14:59 crc kubenswrapper[4738]: I0307 07:14:59.011248 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckvsw\" (UniqueName: \"kubernetes.io/projected/ff8317ef-e259-4059-b434-33fdfd265c20-kube-api-access-ckvsw\") pod \"c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28\" (UID: \"ff8317ef-e259-4059-b434-33fdfd265c20\") " pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" Mar 07 07:14:59 crc kubenswrapper[4738]: I0307 07:14:59.088222 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" Mar 07 07:14:59 crc kubenswrapper[4738]: I0307 07:14:59.302761 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28"] Mar 07 07:14:59 crc kubenswrapper[4738]: I0307 07:14:59.917712 4738 generic.go:334] "Generic (PLEG): container finished" podID="ff8317ef-e259-4059-b434-33fdfd265c20" containerID="26282169619bb58571365685ad7a94237a271174a4582c280fd560d9e344734c" exitCode=0 Mar 07 07:14:59 crc kubenswrapper[4738]: I0307 07:14:59.917808 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" event={"ID":"ff8317ef-e259-4059-b434-33fdfd265c20","Type":"ContainerDied","Data":"26282169619bb58571365685ad7a94237a271174a4582c280fd560d9e344734c"} Mar 07 07:14:59 crc kubenswrapper[4738]: I0307 07:14:59.918053 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" event={"ID":"ff8317ef-e259-4059-b434-33fdfd265c20","Type":"ContainerStarted","Data":"dc41896b6365810aba0578293bf859014c7729a3941d2d12fb4c419dd685aaf6"} Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.130210 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h"] Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.131429 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.134985 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.135194 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.137942 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h"] Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.201603 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/158afdca-c9af-440e-a093-d25b482a5fd6-secret-volume\") pod \"collect-profiles-29547795-x997h\" (UID: \"158afdca-c9af-440e-a093-d25b482a5fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.201667 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rndxv\" (UniqueName: \"kubernetes.io/projected/158afdca-c9af-440e-a093-d25b482a5fd6-kube-api-access-rndxv\") pod 
\"collect-profiles-29547795-x997h\" (UID: \"158afdca-c9af-440e-a093-d25b482a5fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.201733 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/158afdca-c9af-440e-a093-d25b482a5fd6-config-volume\") pod \"collect-profiles-29547795-x997h\" (UID: \"158afdca-c9af-440e-a093-d25b482a5fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.303216 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/158afdca-c9af-440e-a093-d25b482a5fd6-config-volume\") pod \"collect-profiles-29547795-x997h\" (UID: \"158afdca-c9af-440e-a093-d25b482a5fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.303400 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/158afdca-c9af-440e-a093-d25b482a5fd6-secret-volume\") pod \"collect-profiles-29547795-x997h\" (UID: \"158afdca-c9af-440e-a093-d25b482a5fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.303441 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rndxv\" (UniqueName: \"kubernetes.io/projected/158afdca-c9af-440e-a093-d25b482a5fd6-kube-api-access-rndxv\") pod \"collect-profiles-29547795-x997h\" (UID: \"158afdca-c9af-440e-a093-d25b482a5fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.304148 4738 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/158afdca-c9af-440e-a093-d25b482a5fd6-config-volume\") pod \"collect-profiles-29547795-x997h\" (UID: \"158afdca-c9af-440e-a093-d25b482a5fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.317011 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/158afdca-c9af-440e-a093-d25b482a5fd6-secret-volume\") pod \"collect-profiles-29547795-x997h\" (UID: \"158afdca-c9af-440e-a093-d25b482a5fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.323375 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rndxv\" (UniqueName: \"kubernetes.io/projected/158afdca-c9af-440e-a093-d25b482a5fd6-kube-api-access-rndxv\") pod \"collect-profiles-29547795-x997h\" (UID: \"158afdca-c9af-440e-a093-d25b482a5fd6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.449104 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.929771 4738 generic.go:334] "Generic (PLEG): container finished" podID="ff8317ef-e259-4059-b434-33fdfd265c20" containerID="b59d9ce3904f25d07b7e3c558a9caa48ffce9a725e3e022d3f86a455e81549f4" exitCode=0 Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.929864 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" event={"ID":"ff8317ef-e259-4059-b434-33fdfd265c20","Type":"ContainerDied","Data":"b59d9ce3904f25d07b7e3c558a9caa48ffce9a725e3e022d3f86a455e81549f4"} Mar 07 07:15:00 crc kubenswrapper[4738]: I0307 07:15:00.945120 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h"] Mar 07 07:15:00 crc kubenswrapper[4738]: W0307 07:15:00.952843 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod158afdca_c9af_440e_a093_d25b482a5fd6.slice/crio-50d5ee3b094de8b715bd1c91008595ae1139675b0bb14d132da4adbdf6a376e5 WatchSource:0}: Error finding container 50d5ee3b094de8b715bd1c91008595ae1139675b0bb14d132da4adbdf6a376e5: Status 404 returned error can't find the container with id 50d5ee3b094de8b715bd1c91008595ae1139675b0bb14d132da4adbdf6a376e5 Mar 07 07:15:01 crc kubenswrapper[4738]: I0307 07:15:01.938277 4738 generic.go:334] "Generic (PLEG): container finished" podID="158afdca-c9af-440e-a093-d25b482a5fd6" containerID="c4b7a56716a2b5c58fe7677bca122cbe799e1cd2393c1137e40a5fd262c8b2fe" exitCode=0 Mar 07 07:15:01 crc kubenswrapper[4738]: I0307 07:15:01.938361 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" 
event={"ID":"158afdca-c9af-440e-a093-d25b482a5fd6","Type":"ContainerDied","Data":"c4b7a56716a2b5c58fe7677bca122cbe799e1cd2393c1137e40a5fd262c8b2fe"} Mar 07 07:15:01 crc kubenswrapper[4738]: I0307 07:15:01.938699 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" event={"ID":"158afdca-c9af-440e-a093-d25b482a5fd6","Type":"ContainerStarted","Data":"50d5ee3b094de8b715bd1c91008595ae1139675b0bb14d132da4adbdf6a376e5"} Mar 07 07:15:01 crc kubenswrapper[4738]: I0307 07:15:01.944444 4738 generic.go:334] "Generic (PLEG): container finished" podID="ff8317ef-e259-4059-b434-33fdfd265c20" containerID="0b4ca63af98a26f5e01913438023d369114390c53aa1e2a3a3e3888f2997a881" exitCode=0 Mar 07 07:15:01 crc kubenswrapper[4738]: I0307 07:15:01.944523 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" event={"ID":"ff8317ef-e259-4059-b434-33fdfd265c20","Type":"ContainerDied","Data":"0b4ca63af98a26f5e01913438023d369114390c53aa1e2a3a3e3888f2997a881"} Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.220140 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.281279 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.362482 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/158afdca-c9af-440e-a093-d25b482a5fd6-config-volume\") pod \"158afdca-c9af-440e-a093-d25b482a5fd6\" (UID: \"158afdca-c9af-440e-a093-d25b482a5fd6\") " Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.362595 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ff8317ef-e259-4059-b434-33fdfd265c20-bundle\") pod \"ff8317ef-e259-4059-b434-33fdfd265c20\" (UID: \"ff8317ef-e259-4059-b434-33fdfd265c20\") " Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.362629 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ff8317ef-e259-4059-b434-33fdfd265c20-util\") pod \"ff8317ef-e259-4059-b434-33fdfd265c20\" (UID: \"ff8317ef-e259-4059-b434-33fdfd265c20\") " Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.362696 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rndxv\" (UniqueName: \"kubernetes.io/projected/158afdca-c9af-440e-a093-d25b482a5fd6-kube-api-access-rndxv\") pod \"158afdca-c9af-440e-a093-d25b482a5fd6\" (UID: \"158afdca-c9af-440e-a093-d25b482a5fd6\") " Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.363486 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/158afdca-c9af-440e-a093-d25b482a5fd6-config-volume" (OuterVolumeSpecName: "config-volume") pod "158afdca-c9af-440e-a093-d25b482a5fd6" (UID: "158afdca-c9af-440e-a093-d25b482a5fd6"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.364069 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/158afdca-c9af-440e-a093-d25b482a5fd6-secret-volume\") pod \"158afdca-c9af-440e-a093-d25b482a5fd6\" (UID: \"158afdca-c9af-440e-a093-d25b482a5fd6\") " Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.364108 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckvsw\" (UniqueName: \"kubernetes.io/projected/ff8317ef-e259-4059-b434-33fdfd265c20-kube-api-access-ckvsw\") pod \"ff8317ef-e259-4059-b434-33fdfd265c20\" (UID: \"ff8317ef-e259-4059-b434-33fdfd265c20\") " Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.364635 4738 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/158afdca-c9af-440e-a093-d25b482a5fd6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.365787 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8317ef-e259-4059-b434-33fdfd265c20-bundle" (OuterVolumeSpecName: "bundle") pod "ff8317ef-e259-4059-b434-33fdfd265c20" (UID: "ff8317ef-e259-4059-b434-33fdfd265c20"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.367677 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/158afdca-c9af-440e-a093-d25b482a5fd6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "158afdca-c9af-440e-a093-d25b482a5fd6" (UID: "158afdca-c9af-440e-a093-d25b482a5fd6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.367738 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/158afdca-c9af-440e-a093-d25b482a5fd6-kube-api-access-rndxv" (OuterVolumeSpecName: "kube-api-access-rndxv") pod "158afdca-c9af-440e-a093-d25b482a5fd6" (UID: "158afdca-c9af-440e-a093-d25b482a5fd6"). InnerVolumeSpecName "kube-api-access-rndxv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.371586 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff8317ef-e259-4059-b434-33fdfd265c20-kube-api-access-ckvsw" (OuterVolumeSpecName: "kube-api-access-ckvsw") pod "ff8317ef-e259-4059-b434-33fdfd265c20" (UID: "ff8317ef-e259-4059-b434-33fdfd265c20"). InnerVolumeSpecName "kube-api-access-ckvsw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.375749 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff8317ef-e259-4059-b434-33fdfd265c20-util" (OuterVolumeSpecName: "util") pod "ff8317ef-e259-4059-b434-33fdfd265c20" (UID: "ff8317ef-e259-4059-b434-33fdfd265c20"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.465907 4738 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ff8317ef-e259-4059-b434-33fdfd265c20-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.466064 4738 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ff8317ef-e259-4059-b434-33fdfd265c20-util\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.466102 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rndxv\" (UniqueName: \"kubernetes.io/projected/158afdca-c9af-440e-a093-d25b482a5fd6-kube-api-access-rndxv\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.466116 4738 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/158afdca-c9af-440e-a093-d25b482a5fd6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.466129 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckvsw\" (UniqueName: \"kubernetes.io/projected/ff8317ef-e259-4059-b434-33fdfd265c20-kube-api-access-ckvsw\") on node \"crc\" DevicePath \"\"" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.960834 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.960750 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547795-x997h" event={"ID":"158afdca-c9af-440e-a093-d25b482a5fd6","Type":"ContainerDied","Data":"50d5ee3b094de8b715bd1c91008595ae1139675b0bb14d132da4adbdf6a376e5"} Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.961239 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50d5ee3b094de8b715bd1c91008595ae1139675b0bb14d132da4adbdf6a376e5" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.964047 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" event={"ID":"ff8317ef-e259-4059-b434-33fdfd265c20","Type":"ContainerDied","Data":"dc41896b6365810aba0578293bf859014c7729a3941d2d12fb4c419dd685aaf6"} Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.964087 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc41896b6365810aba0578293bf859014c7729a3941d2d12fb4c419dd685aaf6" Mar 07 07:15:03 crc kubenswrapper[4738]: I0307 07:15:03.964120 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.229285 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt"] Mar 07 07:15:17 crc kubenswrapper[4738]: E0307 07:15:17.230510 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8317ef-e259-4059-b434-33fdfd265c20" containerName="pull" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.230530 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8317ef-e259-4059-b434-33fdfd265c20" containerName="pull" Mar 07 07:15:17 crc kubenswrapper[4738]: E0307 07:15:17.230554 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="158afdca-c9af-440e-a093-d25b482a5fd6" containerName="collect-profiles" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.230560 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="158afdca-c9af-440e-a093-d25b482a5fd6" containerName="collect-profiles" Mar 07 07:15:17 crc kubenswrapper[4738]: E0307 07:15:17.230572 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8317ef-e259-4059-b434-33fdfd265c20" containerName="util" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.230578 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8317ef-e259-4059-b434-33fdfd265c20" containerName="util" Mar 07 07:15:17 crc kubenswrapper[4738]: E0307 07:15:17.230590 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff8317ef-e259-4059-b434-33fdfd265c20" containerName="extract" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.230596 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff8317ef-e259-4059-b434-33fdfd265c20" containerName="extract" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.230703 4738 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ff8317ef-e259-4059-b434-33fdfd265c20" containerName="extract" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.230722 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="158afdca-c9af-440e-a093-d25b482a5fd6" containerName="collect-profiles" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.231287 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.234438 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-fhjl4" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.234507 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.251810 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt"] Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.364067 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22276e47-e857-4b2f-a2ef-0976a50c1294-apiservice-cert\") pod \"infra-operator-controller-manager-676c9dd8fb-jzgbt\" (UID: \"22276e47-e857-4b2f-a2ef-0976a50c1294\") " pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.364193 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvkmq\" (UniqueName: \"kubernetes.io/projected/22276e47-e857-4b2f-a2ef-0976a50c1294-kube-api-access-jvkmq\") pod \"infra-operator-controller-manager-676c9dd8fb-jzgbt\" (UID: \"22276e47-e857-4b2f-a2ef-0976a50c1294\") " 
pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.364269 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22276e47-e857-4b2f-a2ef-0976a50c1294-webhook-cert\") pod \"infra-operator-controller-manager-676c9dd8fb-jzgbt\" (UID: \"22276e47-e857-4b2f-a2ef-0976a50c1294\") " pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.465432 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22276e47-e857-4b2f-a2ef-0976a50c1294-apiservice-cert\") pod \"infra-operator-controller-manager-676c9dd8fb-jzgbt\" (UID: \"22276e47-e857-4b2f-a2ef-0976a50c1294\") " pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.465608 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvkmq\" (UniqueName: \"kubernetes.io/projected/22276e47-e857-4b2f-a2ef-0976a50c1294-kube-api-access-jvkmq\") pod \"infra-operator-controller-manager-676c9dd8fb-jzgbt\" (UID: \"22276e47-e857-4b2f-a2ef-0976a50c1294\") " pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.465749 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22276e47-e857-4b2f-a2ef-0976a50c1294-webhook-cert\") pod \"infra-operator-controller-manager-676c9dd8fb-jzgbt\" (UID: \"22276e47-e857-4b2f-a2ef-0976a50c1294\") " pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.472956 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/22276e47-e857-4b2f-a2ef-0976a50c1294-webhook-cert\") pod \"infra-operator-controller-manager-676c9dd8fb-jzgbt\" (UID: \"22276e47-e857-4b2f-a2ef-0976a50c1294\") " pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.473379 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/22276e47-e857-4b2f-a2ef-0976a50c1294-apiservice-cert\") pod \"infra-operator-controller-manager-676c9dd8fb-jzgbt\" (UID: \"22276e47-e857-4b2f-a2ef-0976a50c1294\") " pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.491369 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvkmq\" (UniqueName: \"kubernetes.io/projected/22276e47-e857-4b2f-a2ef-0976a50c1294-kube-api-access-jvkmq\") pod \"infra-operator-controller-manager-676c9dd8fb-jzgbt\" (UID: \"22276e47-e857-4b2f-a2ef-0976a50c1294\") " pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.554101 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.847052 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt"] Mar 07 07:15:17 crc kubenswrapper[4738]: W0307 07:15:17.858570 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22276e47_e857_4b2f_a2ef_0976a50c1294.slice/crio-d7e7859069fe5829c85ab5b924e67bccd00f394e257cab221444c96c0156b5a6 WatchSource:0}: Error finding container d7e7859069fe5829c85ab5b924e67bccd00f394e257cab221444c96c0156b5a6: Status 404 returned error can't find the container with id d7e7859069fe5829c85ab5b924e67bccd00f394e257cab221444c96c0156b5a6 Mar 07 07:15:17 crc kubenswrapper[4738]: I0307 07:15:17.863054 4738 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:15:18 crc kubenswrapper[4738]: I0307 07:15:18.078757 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" event={"ID":"22276e47-e857-4b2f-a2ef-0976a50c1294","Type":"ContainerStarted","Data":"d7e7859069fe5829c85ab5b924e67bccd00f394e257cab221444c96c0156b5a6"} Mar 07 07:15:20 crc kubenswrapper[4738]: I0307 07:15:20.102720 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" event={"ID":"22276e47-e857-4b2f-a2ef-0976a50c1294","Type":"ContainerStarted","Data":"9bca3c906dd478860d5851e910640b6c959f61b527b231ed141d399c277c5407"} Mar 07 07:15:20 crc kubenswrapper[4738]: I0307 07:15:20.103132 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" Mar 07 07:15:20 crc kubenswrapper[4738]: I0307 07:15:20.128853 4738 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" podStartSLOduration=1.1672602300000001 podStartE2EDuration="3.128831089s" podCreationTimestamp="2026-03-07 07:15:17 +0000 UTC" firstStartedPulling="2026-03-07 07:15:17.862783916 +0000 UTC m=+936.327771237" lastFinishedPulling="2026-03-07 07:15:19.824354765 +0000 UTC m=+938.289342096" observedRunningTime="2026-03-07 07:15:20.124393621 +0000 UTC m=+938.589380942" watchObservedRunningTime="2026-03-07 07:15:20.128831089 +0000 UTC m=+938.593818420" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.607286 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/openstack-galera-0"] Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.609550 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.612204 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"kube-root-ca.crt" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.612540 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"openshift-service-ca.crt" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.612759 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"galera-openstack-dockercfg-mbd2t" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.612839 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"openstack-scripts" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.614453 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"openstack-config-data" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.626731 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/openstack-galera-1"] Mar 07 07:15:25 crc 
kubenswrapper[4738]: I0307 07:15:25.628356 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.633985 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/openstack-galera-2"] Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.635650 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.641226 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-0"] Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.654254 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-1"] Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.660004 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-2"] Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.690922 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7mlx\" (UniqueName: \"kubernetes.io/projected/a9054324-a882-4b95-adda-bde2ea2ab268-kube-api-access-t7mlx\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.690983 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a9054324-a882-4b95-adda-bde2ea2ab268-kolla-config\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.691029 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/a9054324-a882-4b95-adda-bde2ea2ab268-config-data-default\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.691057 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.691086 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqbcj\" (UniqueName: \"kubernetes.io/projected/acf747a6-cf73-4b69-a1b7-bf95cde29f63-kube-api-access-qqbcj\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.691112 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/acf747a6-cf73-4b69-a1b7-bf95cde29f63-kolla-config\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.691139 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a9054324-a882-4b95-adda-bde2ea2ab268-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.691166 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/acf747a6-cf73-4b69-a1b7-bf95cde29f63-operator-scripts\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.691211 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/acf747a6-cf73-4b69-a1b7-bf95cde29f63-config-data-default\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.691254 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.691278 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9054324-a882-4b95-adda-bde2ea2ab268-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.691312 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/acf747a6-cf73-4b69-a1b7-bf95cde29f63-config-data-generated\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793015 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/acf747a6-cf73-4b69-a1b7-bf95cde29f63-config-data-generated\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793097 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/178ed054-c2f4-450d-99b8-88fd079432cf-kolla-config\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793138 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7mlx\" (UniqueName: \"kubernetes.io/projected/a9054324-a882-4b95-adda-bde2ea2ab268-kube-api-access-t7mlx\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793160 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/178ed054-c2f4-450d-99b8-88fd079432cf-config-data-default\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793207 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a9054324-a882-4b95-adda-bde2ea2ab268-kolla-config\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793243 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/178ed054-c2f4-450d-99b8-88fd079432cf-config-data-generated\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793266 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsjc5\" (UniqueName: \"kubernetes.io/projected/178ed054-c2f4-450d-99b8-88fd079432cf-kube-api-access-rsjc5\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793294 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a9054324-a882-4b95-adda-bde2ea2ab268-config-data-default\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793313 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793335 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qqbcj\" (UniqueName: \"kubernetes.io/projected/acf747a6-cf73-4b69-a1b7-bf95cde29f63-kube-api-access-qqbcj\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793356 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/acf747a6-cf73-4b69-a1b7-bf95cde29f63-kolla-config\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793375 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178ed054-c2f4-450d-99b8-88fd079432cf-operator-scripts\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793393 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a9054324-a882-4b95-adda-bde2ea2ab268-config-data-generated\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793412 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793486 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acf747a6-cf73-4b69-a1b7-bf95cde29f63-operator-scripts\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793512 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: 
\"kubernetes.io/configmap/acf747a6-cf73-4b69-a1b7-bf95cde29f63-config-data-default\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793533 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793557 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9054324-a882-4b95-adda-bde2ea2ab268-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.793846 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/acf747a6-cf73-4b69-a1b7-bf95cde29f63-config-data-generated\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.794363 4738 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") device mount path \"/mnt/openstack/pv01\"" pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.794903 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/a9054324-a882-4b95-adda-bde2ea2ab268-config-data-generated\") pod 
\"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.795511 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/acf747a6-cf73-4b69-a1b7-bf95cde29f63-kolla-config\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.795571 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a9054324-a882-4b95-adda-bde2ea2ab268-operator-scripts\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.795525 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/a9054324-a882-4b95-adda-bde2ea2ab268-kolla-config\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.795657 4738 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") device mount path \"/mnt/openstack/pv03\"" pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.796672 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/acf747a6-cf73-4b69-a1b7-bf95cde29f63-config-data-default\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 
07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.797191 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/a9054324-a882-4b95-adda-bde2ea2ab268-config-data-default\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.797340 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/acf747a6-cf73-4b69-a1b7-bf95cde29f63-operator-scripts\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.815952 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqbcj\" (UniqueName: \"kubernetes.io/projected/acf747a6-cf73-4b69-a1b7-bf95cde29f63-kube-api-access-qqbcj\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.818257 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7mlx\" (UniqueName: \"kubernetes.io/projected/a9054324-a882-4b95-adda-bde2ea2ab268-kube-api-access-t7mlx\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.818523 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"a9054324-a882-4b95-adda-bde2ea2ab268\") " pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.819063 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-1\" (UID: \"acf747a6-cf73-4b69-a1b7-bf95cde29f63\") " pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.895290 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/178ed054-c2f4-450d-99b8-88fd079432cf-kolla-config\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.895388 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/178ed054-c2f4-450d-99b8-88fd079432cf-config-data-default\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.895430 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/178ed054-c2f4-450d-99b8-88fd079432cf-config-data-generated\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.895457 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsjc5\" (UniqueName: \"kubernetes.io/projected/178ed054-c2f4-450d-99b8-88fd079432cf-kube-api-access-rsjc5\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.895511 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/178ed054-c2f4-450d-99b8-88fd079432cf-operator-scripts\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.895544 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.895774 4738 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") device mount path \"/mnt/openstack/pv08\"" pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.896417 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/178ed054-c2f4-450d-99b8-88fd079432cf-config-data-generated\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.896690 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/178ed054-c2f4-450d-99b8-88fd079432cf-kolla-config\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.897004 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/178ed054-c2f4-450d-99b8-88fd079432cf-config-data-default\") pod \"openstack-galera-2\" (UID: 
\"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.897769 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/178ed054-c2f4-450d-99b8-88fd079432cf-operator-scripts\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.919298 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.937920 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsjc5\" (UniqueName: \"kubernetes.io/projected/178ed054-c2f4-450d-99b8-88fd079432cf-kube-api-access-rsjc5\") pod \"openstack-galera-2\" (UID: \"178ed054-c2f4-450d-99b8-88fd079432cf\") " pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.938568 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.972620 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:25 crc kubenswrapper[4738]: I0307 07:15:25.980728 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:26 crc kubenswrapper[4738]: I0307 07:15:26.280469 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-0"] Mar 07 07:15:26 crc kubenswrapper[4738]: I0307 07:15:26.557698 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-2"] Mar 07 07:15:26 crc kubenswrapper[4738]: W0307 07:15:26.560134 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod178ed054_c2f4_450d_99b8_88fd079432cf.slice/crio-2d79141671b24ed84233a3ab55b6038d03c01a826bbfe57747ccfdd5bc576285 WatchSource:0}: Error finding container 2d79141671b24ed84233a3ab55b6038d03c01a826bbfe57747ccfdd5bc576285: Status 404 returned error can't find the container with id 2d79141671b24ed84233a3ab55b6038d03c01a826bbfe57747ccfdd5bc576285 Mar 07 07:15:26 crc kubenswrapper[4738]: I0307 07:15:26.562535 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/openstack-galera-1"] Mar 07 07:15:26 crc kubenswrapper[4738]: W0307 07:15:26.564523 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacf747a6_cf73_4b69_a1b7_bf95cde29f63.slice/crio-27259d623c67c8340e7f13eca9534dc4da4b95dbaa7e52511e82f1064fee92fa WatchSource:0}: Error finding container 27259d623c67c8340e7f13eca9534dc4da4b95dbaa7e52511e82f1064fee92fa: Status 404 returned error can't find the container with id 27259d623c67c8340e7f13eca9534dc4da4b95dbaa7e52511e82f1064fee92fa Mar 07 07:15:26 crc kubenswrapper[4738]: I0307 07:15:26.957734 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 
07:15:26 crc kubenswrapper[4738]: I0307 07:15:26.957809 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:15:26 crc kubenswrapper[4738]: I0307 07:15:26.957857 4738 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:15:26 crc kubenswrapper[4738]: I0307 07:15:26.958550 4738 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"243b9193ac6c5a75ee43ac39995d1462b4eb26a64963d0af07b51ef1dcd6c0f7"} pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:15:26 crc kubenswrapper[4738]: I0307 07:15:26.958606 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" containerID="cri-o://243b9193ac6c5a75ee43ac39995d1462b4eb26a64963d0af07b51ef1dcd6c0f7" gracePeriod=600 Mar 07 07:15:27 crc kubenswrapper[4738]: I0307 07:15:27.162244 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" event={"ID":"178ed054-c2f4-450d-99b8-88fd079432cf","Type":"ContainerStarted","Data":"2d79141671b24ed84233a3ab55b6038d03c01a826bbfe57747ccfdd5bc576285"} Mar 07 07:15:27 crc kubenswrapper[4738]: I0307 07:15:27.163426 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" 
event={"ID":"a9054324-a882-4b95-adda-bde2ea2ab268","Type":"ContainerStarted","Data":"1e4791d8fb1bbe62b1c3ba889811c1975deb6fba31073c0eb43758347f9173eb"} Mar 07 07:15:27 crc kubenswrapper[4738]: I0307 07:15:27.165678 4738 generic.go:334] "Generic (PLEG): container finished" podID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerID="243b9193ac6c5a75ee43ac39995d1462b4eb26a64963d0af07b51ef1dcd6c0f7" exitCode=0 Mar 07 07:15:27 crc kubenswrapper[4738]: I0307 07:15:27.165721 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerDied","Data":"243b9193ac6c5a75ee43ac39995d1462b4eb26a64963d0af07b51ef1dcd6c0f7"} Mar 07 07:15:27 crc kubenswrapper[4738]: I0307 07:15:27.165775 4738 scope.go:117] "RemoveContainer" containerID="f23a128c52dea9413300b9d6a04d09e3f706df7a2a254675e7571a8487859e17" Mar 07 07:15:27 crc kubenswrapper[4738]: I0307 07:15:27.166936 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"acf747a6-cf73-4b69-a1b7-bf95cde29f63","Type":"ContainerStarted","Data":"27259d623c67c8340e7f13eca9534dc4da4b95dbaa7e52511e82f1064fee92fa"} Mar 07 07:15:27 crc kubenswrapper[4738]: I0307 07:15:27.559473 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-676c9dd8fb-jzgbt" Mar 07 07:15:28 crc kubenswrapper[4738]: I0307 07:15:28.177313 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerStarted","Data":"e5bec806593533cac1b1c1cc77e678f82478e03cc5fcc9234057125e94ccf2f7"} Mar 07 07:15:31 crc kubenswrapper[4738]: I0307 07:15:31.719000 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-xmcn5"] Mar 07 07:15:31 crc 
kubenswrapper[4738]: I0307 07:15:31.723384 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-xmcn5" Mar 07 07:15:31 crc kubenswrapper[4738]: I0307 07:15:31.725856 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-jbscs" Mar 07 07:15:31 crc kubenswrapper[4738]: I0307 07:15:31.748631 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-xmcn5"] Mar 07 07:15:31 crc kubenswrapper[4738]: I0307 07:15:31.898706 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzkj6\" (UniqueName: \"kubernetes.io/projected/5bad6865-faaa-4032-b282-4d5a699bd4e2-kube-api-access-pzkj6\") pod \"rabbitmq-cluster-operator-index-xmcn5\" (UID: \"5bad6865-faaa-4032-b282-4d5a699bd4e2\") " pod="openstack-operators/rabbitmq-cluster-operator-index-xmcn5" Mar 07 07:15:32 crc kubenswrapper[4738]: I0307 07:15:32.000128 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzkj6\" (UniqueName: \"kubernetes.io/projected/5bad6865-faaa-4032-b282-4d5a699bd4e2-kube-api-access-pzkj6\") pod \"rabbitmq-cluster-operator-index-xmcn5\" (UID: \"5bad6865-faaa-4032-b282-4d5a699bd4e2\") " pod="openstack-operators/rabbitmq-cluster-operator-index-xmcn5" Mar 07 07:15:32 crc kubenswrapper[4738]: I0307 07:15:32.027995 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzkj6\" (UniqueName: \"kubernetes.io/projected/5bad6865-faaa-4032-b282-4d5a699bd4e2-kube-api-access-pzkj6\") pod \"rabbitmq-cluster-operator-index-xmcn5\" (UID: \"5bad6865-faaa-4032-b282-4d5a699bd4e2\") " pod="openstack-operators/rabbitmq-cluster-operator-index-xmcn5" Mar 07 07:15:32 crc kubenswrapper[4738]: I0307 07:15:32.048612 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-xmcn5" Mar 07 07:15:34 crc kubenswrapper[4738]: I0307 07:15:34.355800 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/memcached-0"] Mar 07 07:15:34 crc kubenswrapper[4738]: I0307 07:15:34.357239 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/memcached-0" Mar 07 07:15:34 crc kubenswrapper[4738]: I0307 07:15:34.358913 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"memcached-memcached-dockercfg-nthvg" Mar 07 07:15:34 crc kubenswrapper[4738]: I0307 07:15:34.359912 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"memcached-config-data" Mar 07 07:15:34 crc kubenswrapper[4738]: I0307 07:15:34.396751 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/memcached-0"] Mar 07 07:15:34 crc kubenswrapper[4738]: I0307 07:15:34.544273 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxlx2\" (UniqueName: \"kubernetes.io/projected/18680f34-6448-42b6-bb6d-c87e7f1becb7-kube-api-access-cxlx2\") pod \"memcached-0\" (UID: \"18680f34-6448-42b6-bb6d-c87e7f1becb7\") " pod="swift-kuttl-tests/memcached-0" Mar 07 07:15:34 crc kubenswrapper[4738]: I0307 07:15:34.544471 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/18680f34-6448-42b6-bb6d-c87e7f1becb7-kolla-config\") pod \"memcached-0\" (UID: \"18680f34-6448-42b6-bb6d-c87e7f1becb7\") " pod="swift-kuttl-tests/memcached-0" Mar 07 07:15:34 crc kubenswrapper[4738]: I0307 07:15:34.544583 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18680f34-6448-42b6-bb6d-c87e7f1becb7-config-data\") pod \"memcached-0\" 
(UID: \"18680f34-6448-42b6-bb6d-c87e7f1becb7\") " pod="swift-kuttl-tests/memcached-0" Mar 07 07:15:34 crc kubenswrapper[4738]: I0307 07:15:34.646850 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxlx2\" (UniqueName: \"kubernetes.io/projected/18680f34-6448-42b6-bb6d-c87e7f1becb7-kube-api-access-cxlx2\") pod \"memcached-0\" (UID: \"18680f34-6448-42b6-bb6d-c87e7f1becb7\") " pod="swift-kuttl-tests/memcached-0" Mar 07 07:15:34 crc kubenswrapper[4738]: I0307 07:15:34.646952 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/18680f34-6448-42b6-bb6d-c87e7f1becb7-kolla-config\") pod \"memcached-0\" (UID: \"18680f34-6448-42b6-bb6d-c87e7f1becb7\") " pod="swift-kuttl-tests/memcached-0" Mar 07 07:15:34 crc kubenswrapper[4738]: I0307 07:15:34.646998 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18680f34-6448-42b6-bb6d-c87e7f1becb7-config-data\") pod \"memcached-0\" (UID: \"18680f34-6448-42b6-bb6d-c87e7f1becb7\") " pod="swift-kuttl-tests/memcached-0" Mar 07 07:15:34 crc kubenswrapper[4738]: I0307 07:15:34.648212 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/18680f34-6448-42b6-bb6d-c87e7f1becb7-kolla-config\") pod \"memcached-0\" (UID: \"18680f34-6448-42b6-bb6d-c87e7f1becb7\") " pod="swift-kuttl-tests/memcached-0" Mar 07 07:15:34 crc kubenswrapper[4738]: I0307 07:15:34.648270 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/18680f34-6448-42b6-bb6d-c87e7f1becb7-config-data\") pod \"memcached-0\" (UID: \"18680f34-6448-42b6-bb6d-c87e7f1becb7\") " pod="swift-kuttl-tests/memcached-0" Mar 07 07:15:34 crc kubenswrapper[4738]: I0307 07:15:34.674456 4738 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-cxlx2\" (UniqueName: \"kubernetes.io/projected/18680f34-6448-42b6-bb6d-c87e7f1becb7-kube-api-access-cxlx2\") pod \"memcached-0\" (UID: \"18680f34-6448-42b6-bb6d-c87e7f1becb7\") " pod="swift-kuttl-tests/memcached-0" Mar 07 07:15:34 crc kubenswrapper[4738]: I0307 07:15:34.677401 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/memcached-0" Mar 07 07:15:37 crc kubenswrapper[4738]: I0307 07:15:37.457286 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/memcached-0"] Mar 07 07:15:37 crc kubenswrapper[4738]: W0307 07:15:37.460610 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18680f34_6448_42b6_bb6d_c87e7f1becb7.slice/crio-c84ed3e9bb696e3ad52df37b57e38be9b11d71aefdbc5e254502b05e1efa5b9a WatchSource:0}: Error finding container c84ed3e9bb696e3ad52df37b57e38be9b11d71aefdbc5e254502b05e1efa5b9a: Status 404 returned error can't find the container with id c84ed3e9bb696e3ad52df37b57e38be9b11d71aefdbc5e254502b05e1efa5b9a Mar 07 07:15:37 crc kubenswrapper[4738]: I0307 07:15:37.591830 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-xmcn5"] Mar 07 07:15:37 crc kubenswrapper[4738]: W0307 07:15:37.599435 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bad6865_faaa_4032_b282_4d5a699bd4e2.slice/crio-d043342ebd4ac1b6f74e5c0f52416307b2d4d94c1340039957ecb503e7919bee WatchSource:0}: Error finding container d043342ebd4ac1b6f74e5c0f52416307b2d4d94c1340039957ecb503e7919bee: Status 404 returned error can't find the container with id d043342ebd4ac1b6f74e5c0f52416307b2d4d94c1340039957ecb503e7919bee Mar 07 07:15:38 crc kubenswrapper[4738]: I0307 07:15:38.328494 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"acf747a6-cf73-4b69-a1b7-bf95cde29f63","Type":"ContainerStarted","Data":"7e87326ceb2345075ec756aeade6d158efa093c5d087fadddb35fb648ab7d684"} Mar 07 07:15:38 crc kubenswrapper[4738]: I0307 07:15:38.330187 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-xmcn5" event={"ID":"5bad6865-faaa-4032-b282-4d5a699bd4e2","Type":"ContainerStarted","Data":"d043342ebd4ac1b6f74e5c0f52416307b2d4d94c1340039957ecb503e7919bee"} Mar 07 07:15:38 crc kubenswrapper[4738]: I0307 07:15:38.331414 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" event={"ID":"178ed054-c2f4-450d-99b8-88fd079432cf","Type":"ContainerStarted","Data":"88cb1c26cd51ac24ec7ff53c99ed39c2cbf68ee0a58104b7cd6e207478ba25bb"} Mar 07 07:15:38 crc kubenswrapper[4738]: I0307 07:15:38.332729 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"a9054324-a882-4b95-adda-bde2ea2ab268","Type":"ContainerStarted","Data":"2fb616b454b0afa314c51396c3130c34746d6779b1da447e59c329517d6529ef"} Mar 07 07:15:38 crc kubenswrapper[4738]: I0307 07:15:38.335220 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/memcached-0" event={"ID":"18680f34-6448-42b6-bb6d-c87e7f1becb7","Type":"ContainerStarted","Data":"c84ed3e9bb696e3ad52df37b57e38be9b11d71aefdbc5e254502b05e1efa5b9a"} Mar 07 07:15:42 crc kubenswrapper[4738]: I0307 07:15:42.368869 4738 generic.go:334] "Generic (PLEG): container finished" podID="178ed054-c2f4-450d-99b8-88fd079432cf" containerID="88cb1c26cd51ac24ec7ff53c99ed39c2cbf68ee0a58104b7cd6e207478ba25bb" exitCode=0 Mar 07 07:15:42 crc kubenswrapper[4738]: I0307 07:15:42.368927 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" 
event={"ID":"178ed054-c2f4-450d-99b8-88fd079432cf","Type":"ContainerDied","Data":"88cb1c26cd51ac24ec7ff53c99ed39c2cbf68ee0a58104b7cd6e207478ba25bb"} Mar 07 07:15:42 crc kubenswrapper[4738]: I0307 07:15:42.371958 4738 generic.go:334] "Generic (PLEG): container finished" podID="a9054324-a882-4b95-adda-bde2ea2ab268" containerID="2fb616b454b0afa314c51396c3130c34746d6779b1da447e59c329517d6529ef" exitCode=0 Mar 07 07:15:42 crc kubenswrapper[4738]: I0307 07:15:42.372006 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"a9054324-a882-4b95-adda-bde2ea2ab268","Type":"ContainerDied","Data":"2fb616b454b0afa314c51396c3130c34746d6779b1da447e59c329517d6529ef"} Mar 07 07:15:42 crc kubenswrapper[4738]: I0307 07:15:42.375240 4738 generic.go:334] "Generic (PLEG): container finished" podID="acf747a6-cf73-4b69-a1b7-bf95cde29f63" containerID="7e87326ceb2345075ec756aeade6d158efa093c5d087fadddb35fb648ab7d684" exitCode=0 Mar 07 07:15:42 crc kubenswrapper[4738]: I0307 07:15:42.375268 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" event={"ID":"acf747a6-cf73-4b69-a1b7-bf95cde29f63","Type":"ContainerDied","Data":"7e87326ceb2345075ec756aeade6d158efa093c5d087fadddb35fb648ab7d684"} Mar 07 07:15:43 crc kubenswrapper[4738]: I0307 07:15:43.385280 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/memcached-0" event={"ID":"18680f34-6448-42b6-bb6d-c87e7f1becb7","Type":"ContainerStarted","Data":"c6f93e329a26b646d4cc7e4726746b265960dc38ccd16c0423f5445220b463c7"} Mar 07 07:15:43 crc kubenswrapper[4738]: I0307 07:15:43.386042 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/memcached-0" Mar 07 07:15:43 crc kubenswrapper[4738]: I0307 07:15:43.388497 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-1" 
event={"ID":"acf747a6-cf73-4b69-a1b7-bf95cde29f63","Type":"ContainerStarted","Data":"8a32e8ceca0dc78eccae297ca01abbd04af5575c2240e86fd48650257d7ad666"} Mar 07 07:15:43 crc kubenswrapper[4738]: I0307 07:15:43.390947 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-xmcn5" event={"ID":"5bad6865-faaa-4032-b282-4d5a699bd4e2","Type":"ContainerStarted","Data":"58c9008453849bae4fb552a28fddcc866e3916b953be7fedac039be87fb3189a"} Mar 07 07:15:43 crc kubenswrapper[4738]: I0307 07:15:43.393621 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-2" event={"ID":"178ed054-c2f4-450d-99b8-88fd079432cf","Type":"ContainerStarted","Data":"e79c290de7993dd895e14a6047c6aff6c83a14adf324a67f428c67834fffd08a"} Mar 07 07:15:43 crc kubenswrapper[4738]: I0307 07:15:43.396661 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/openstack-galera-0" event={"ID":"a9054324-a882-4b95-adda-bde2ea2ab268","Type":"ContainerStarted","Data":"b979020962c3c3f0020a87ed3bc808da4927182ded9c828983c439d019573429"} Mar 07 07:15:43 crc kubenswrapper[4738]: I0307 07:15:43.417863 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/memcached-0" podStartSLOduration=5.961033501 podStartE2EDuration="9.417834289s" podCreationTimestamp="2026-03-07 07:15:34 +0000 UTC" firstStartedPulling="2026-03-07 07:15:37.463607871 +0000 UTC m=+955.928595192" lastFinishedPulling="2026-03-07 07:15:40.920408659 +0000 UTC m=+959.385395980" observedRunningTime="2026-03-07 07:15:43.412657361 +0000 UTC m=+961.877644682" watchObservedRunningTime="2026-03-07 07:15:43.417834289 +0000 UTC m=+961.882821610" Mar 07 07:15:43 crc kubenswrapper[4738]: I0307 07:15:43.449401 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/openstack-galera-1" podStartSLOduration=8.747853063 podStartE2EDuration="19.4493737s" 
podCreationTimestamp="2026-03-07 07:15:24 +0000 UTC" firstStartedPulling="2026-03-07 07:15:26.567847258 +0000 UTC m=+945.032834589" lastFinishedPulling="2026-03-07 07:15:37.269367905 +0000 UTC m=+955.734355226" observedRunningTime="2026-03-07 07:15:43.443239926 +0000 UTC m=+961.908227257" watchObservedRunningTime="2026-03-07 07:15:43.4493737 +0000 UTC m=+961.914361021" Mar 07 07:15:43 crc kubenswrapper[4738]: I0307 07:15:43.489147 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-xmcn5" podStartSLOduration=7.585065209 podStartE2EDuration="12.489122169s" podCreationTimestamp="2026-03-07 07:15:31 +0000 UTC" firstStartedPulling="2026-03-07 07:15:37.603028167 +0000 UTC m=+956.068015488" lastFinishedPulling="2026-03-07 07:15:42.507085127 +0000 UTC m=+960.972072448" observedRunningTime="2026-03-07 07:15:43.464565484 +0000 UTC m=+961.929552805" watchObservedRunningTime="2026-03-07 07:15:43.489122169 +0000 UTC m=+961.954109510" Mar 07 07:15:43 crc kubenswrapper[4738]: I0307 07:15:43.490771 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/openstack-galera-2" podStartSLOduration=8.864124332 podStartE2EDuration="19.490762543s" podCreationTimestamp="2026-03-07 07:15:24 +0000 UTC" firstStartedPulling="2026-03-07 07:15:26.562909306 +0000 UTC m=+945.027896647" lastFinishedPulling="2026-03-07 07:15:37.189547537 +0000 UTC m=+955.654534858" observedRunningTime="2026-03-07 07:15:43.486251062 +0000 UTC m=+961.951238403" watchObservedRunningTime="2026-03-07 07:15:43.490762543 +0000 UTC m=+961.955749874" Mar 07 07:15:43 crc kubenswrapper[4738]: I0307 07:15:43.506763 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/openstack-galera-0" podStartSLOduration=8.601840131 podStartE2EDuration="19.506731559s" podCreationTimestamp="2026-03-07 07:15:24 +0000 UTC" firstStartedPulling="2026-03-07 07:15:26.294763619 +0000 UTC 
m=+944.759750940" lastFinishedPulling="2026-03-07 07:15:37.199655047 +0000 UTC m=+955.664642368" observedRunningTime="2026-03-07 07:15:43.506641935 +0000 UTC m=+961.971629296" watchObservedRunningTime="2026-03-07 07:15:43.506731559 +0000 UTC m=+961.971718880" Mar 07 07:15:45 crc kubenswrapper[4738]: I0307 07:15:45.938128 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:45 crc kubenswrapper[4738]: I0307 07:15:45.939425 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:15:45 crc kubenswrapper[4738]: I0307 07:15:45.973756 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:45 crc kubenswrapper[4738]: I0307 07:15:45.973813 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:15:45 crc kubenswrapper[4738]: I0307 07:15:45.981657 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:45 crc kubenswrapper[4738]: I0307 07:15:45.981725 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:49 crc kubenswrapper[4738]: I0307 07:15:49.678809 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/memcached-0" Mar 07 07:15:52 crc kubenswrapper[4738]: I0307 07:15:52.048779 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-xmcn5" Mar 07 07:15:52 crc kubenswrapper[4738]: I0307 07:15:52.049329 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-xmcn5" Mar 07 07:15:52 crc kubenswrapper[4738]: I0307 07:15:52.073459 4738 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:52 crc kubenswrapper[4738]: I0307 07:15:52.099745 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-xmcn5" Mar 07 07:15:52 crc kubenswrapper[4738]: I0307 07:15:52.153473 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/openstack-galera-2" Mar 07 07:15:52 crc kubenswrapper[4738]: I0307 07:15:52.499759 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-xmcn5" Mar 07 07:15:54 crc kubenswrapper[4738]: I0307 07:15:54.652697 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/root-account-create-update-j7gqh"] Mar 07 07:15:54 crc kubenswrapper[4738]: I0307 07:15:54.654238 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-j7gqh" Mar 07 07:15:54 crc kubenswrapper[4738]: I0307 07:15:54.657114 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"openstack-mariadb-root-db-secret" Mar 07 07:15:54 crc kubenswrapper[4738]: I0307 07:15:54.660935 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/root-account-create-update-j7gqh"] Mar 07 07:15:54 crc kubenswrapper[4738]: I0307 07:15:54.711731 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7nznd\" (UniqueName: \"kubernetes.io/projected/63ca5884-f8c3-464b-8621-1e97d77e5083-kube-api-access-7nznd\") pod \"root-account-create-update-j7gqh\" (UID: \"63ca5884-f8c3-464b-8621-1e97d77e5083\") " pod="swift-kuttl-tests/root-account-create-update-j7gqh" Mar 07 07:15:54 crc kubenswrapper[4738]: I0307 07:15:54.711811 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63ca5884-f8c3-464b-8621-1e97d77e5083-operator-scripts\") pod \"root-account-create-update-j7gqh\" (UID: \"63ca5884-f8c3-464b-8621-1e97d77e5083\") " pod="swift-kuttl-tests/root-account-create-update-j7gqh" Mar 07 07:15:54 crc kubenswrapper[4738]: I0307 07:15:54.812940 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7nznd\" (UniqueName: \"kubernetes.io/projected/63ca5884-f8c3-464b-8621-1e97d77e5083-kube-api-access-7nznd\") pod \"root-account-create-update-j7gqh\" (UID: \"63ca5884-f8c3-464b-8621-1e97d77e5083\") " pod="swift-kuttl-tests/root-account-create-update-j7gqh" Mar 07 07:15:54 crc kubenswrapper[4738]: I0307 07:15:54.813010 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63ca5884-f8c3-464b-8621-1e97d77e5083-operator-scripts\") pod \"root-account-create-update-j7gqh\" (UID: \"63ca5884-f8c3-464b-8621-1e97d77e5083\") " pod="swift-kuttl-tests/root-account-create-update-j7gqh" Mar 07 07:15:54 crc kubenswrapper[4738]: I0307 07:15:54.813906 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63ca5884-f8c3-464b-8621-1e97d77e5083-operator-scripts\") pod \"root-account-create-update-j7gqh\" (UID: \"63ca5884-f8c3-464b-8621-1e97d77e5083\") " pod="swift-kuttl-tests/root-account-create-update-j7gqh" Mar 07 07:15:54 crc kubenswrapper[4738]: I0307 07:15:54.834427 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7nznd\" (UniqueName: \"kubernetes.io/projected/63ca5884-f8c3-464b-8621-1e97d77e5083-kube-api-access-7nznd\") pod \"root-account-create-update-j7gqh\" (UID: \"63ca5884-f8c3-464b-8621-1e97d77e5083\") " pod="swift-kuttl-tests/root-account-create-update-j7gqh" Mar 07 07:15:54 crc kubenswrapper[4738]: I0307 07:15:54.987752 4738 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-j7gqh" Mar 07 07:15:55 crc kubenswrapper[4738]: I0307 07:15:55.453123 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/root-account-create-update-j7gqh"] Mar 07 07:15:55 crc kubenswrapper[4738]: I0307 07:15:55.490590 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/root-account-create-update-j7gqh" event={"ID":"63ca5884-f8c3-464b-8621-1e97d77e5083","Type":"ContainerStarted","Data":"0874c6a94addb236366125e5459e42a73767fc39c9f7b72e1fa294424ae099e5"} Mar 07 07:15:56 crc kubenswrapper[4738]: I0307 07:15:56.072431 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="swift-kuttl-tests/openstack-galera-2" podUID="178ed054-c2f4-450d-99b8-88fd079432cf" containerName="galera" probeResult="failure" output=< Mar 07 07:15:56 crc kubenswrapper[4738]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Mar 07 07:15:56 crc kubenswrapper[4738]: > Mar 07 07:15:57 crc kubenswrapper[4738]: I0307 07:15:57.506247 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/root-account-create-update-j7gqh" event={"ID":"63ca5884-f8c3-464b-8621-1e97d77e5083","Type":"ContainerStarted","Data":"de970a9fb6f42b417e39ea65d51a5dd6183c4734303d7d6e8f4e06e625946d55"} Mar 07 07:15:57 crc kubenswrapper[4738]: I0307 07:15:57.524339 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/root-account-create-update-j7gqh" podStartSLOduration=3.524311003 podStartE2EDuration="3.524311003s" podCreationTimestamp="2026-03-07 07:15:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:15:57.523669436 +0000 UTC m=+975.988656757" watchObservedRunningTime="2026-03-07 07:15:57.524311003 +0000 UTC m=+975.989298324" Mar 07 07:16:00 crc kubenswrapper[4738]: I0307 07:16:00.127803 4738 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547796-qtj6f"] Mar 07 07:16:00 crc kubenswrapper[4738]: I0307 07:16:00.129123 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547796-qtj6f" Mar 07 07:16:00 crc kubenswrapper[4738]: I0307 07:16:00.132267 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:16:00 crc kubenswrapper[4738]: I0307 07:16:00.132842 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:16:00 crc kubenswrapper[4738]: I0307 07:16:00.134888 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547796-qtj6f"] Mar 07 07:16:00 crc kubenswrapper[4738]: I0307 07:16:00.135896 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:16:00 crc kubenswrapper[4738]: I0307 07:16:00.198883 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hl2v\" (UniqueName: \"kubernetes.io/projected/077a5b42-3ee0-4fcd-90cd-72122279a352-kube-api-access-9hl2v\") pod \"auto-csr-approver-29547796-qtj6f\" (UID: \"077a5b42-3ee0-4fcd-90cd-72122279a352\") " pod="openshift-infra/auto-csr-approver-29547796-qtj6f" Mar 07 07:16:00 crc kubenswrapper[4738]: I0307 07:16:00.303882 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hl2v\" (UniqueName: \"kubernetes.io/projected/077a5b42-3ee0-4fcd-90cd-72122279a352-kube-api-access-9hl2v\") pod \"auto-csr-approver-29547796-qtj6f\" (UID: \"077a5b42-3ee0-4fcd-90cd-72122279a352\") " pod="openshift-infra/auto-csr-approver-29547796-qtj6f" Mar 07 07:16:00 crc kubenswrapper[4738]: I0307 07:16:00.326724 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9hl2v\" (UniqueName: \"kubernetes.io/projected/077a5b42-3ee0-4fcd-90cd-72122279a352-kube-api-access-9hl2v\") pod \"auto-csr-approver-29547796-qtj6f\" (UID: \"077a5b42-3ee0-4fcd-90cd-72122279a352\") " pod="openshift-infra/auto-csr-approver-29547796-qtj6f" Mar 07 07:16:00 crc kubenswrapper[4738]: I0307 07:16:00.449860 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547796-qtj6f" Mar 07 07:16:00 crc kubenswrapper[4738]: I0307 07:16:00.532869 4738 generic.go:334] "Generic (PLEG): container finished" podID="63ca5884-f8c3-464b-8621-1e97d77e5083" containerID="de970a9fb6f42b417e39ea65d51a5dd6183c4734303d7d6e8f4e06e625946d55" exitCode=0 Mar 07 07:16:00 crc kubenswrapper[4738]: I0307 07:16:00.532945 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/root-account-create-update-j7gqh" event={"ID":"63ca5884-f8c3-464b-8621-1e97d77e5083","Type":"ContainerDied","Data":"de970a9fb6f42b417e39ea65d51a5dd6183c4734303d7d6e8f4e06e625946d55"} Mar 07 07:16:00 crc kubenswrapper[4738]: I0307 07:16:00.882419 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547796-qtj6f"] Mar 07 07:16:00 crc kubenswrapper[4738]: W0307 07:16:00.899436 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod077a5b42_3ee0_4fcd_90cd_72122279a352.slice/crio-3e87b32e0328b643ff910019f96cba7fa54cf4836b7ddc60b5100f21f803a188 WatchSource:0}: Error finding container 3e87b32e0328b643ff910019f96cba7fa54cf4836b7ddc60b5100f21f803a188: Status 404 returned error can't find the container with id 3e87b32e0328b643ff910019f96cba7fa54cf4836b7ddc60b5100f21f803a188 Mar 07 07:16:01 crc kubenswrapper[4738]: I0307 07:16:01.541424 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547796-qtj6f" 
event={"ID":"077a5b42-3ee0-4fcd-90cd-72122279a352","Type":"ContainerStarted","Data":"3e87b32e0328b643ff910019f96cba7fa54cf4836b7ddc60b5100f21f803a188"} Mar 07 07:16:01 crc kubenswrapper[4738]: I0307 07:16:01.884475 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-j7gqh" Mar 07 07:16:02 crc kubenswrapper[4738]: I0307 07:16:02.027618 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7nznd\" (UniqueName: \"kubernetes.io/projected/63ca5884-f8c3-464b-8621-1e97d77e5083-kube-api-access-7nznd\") pod \"63ca5884-f8c3-464b-8621-1e97d77e5083\" (UID: \"63ca5884-f8c3-464b-8621-1e97d77e5083\") " Mar 07 07:16:02 crc kubenswrapper[4738]: I0307 07:16:02.027739 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63ca5884-f8c3-464b-8621-1e97d77e5083-operator-scripts\") pod \"63ca5884-f8c3-464b-8621-1e97d77e5083\" (UID: \"63ca5884-f8c3-464b-8621-1e97d77e5083\") " Mar 07 07:16:02 crc kubenswrapper[4738]: I0307 07:16:02.028910 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63ca5884-f8c3-464b-8621-1e97d77e5083-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63ca5884-f8c3-464b-8621-1e97d77e5083" (UID: "63ca5884-f8c3-464b-8621-1e97d77e5083"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:16:02 crc kubenswrapper[4738]: I0307 07:16:02.048421 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63ca5884-f8c3-464b-8621-1e97d77e5083-kube-api-access-7nznd" (OuterVolumeSpecName: "kube-api-access-7nznd") pod "63ca5884-f8c3-464b-8621-1e97d77e5083" (UID: "63ca5884-f8c3-464b-8621-1e97d77e5083"). InnerVolumeSpecName "kube-api-access-7nznd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:16:02 crc kubenswrapper[4738]: I0307 07:16:02.129082 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7nznd\" (UniqueName: \"kubernetes.io/projected/63ca5884-f8c3-464b-8621-1e97d77e5083-kube-api-access-7nznd\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:02 crc kubenswrapper[4738]: I0307 07:16:02.129524 4738 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63ca5884-f8c3-464b-8621-1e97d77e5083-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:02 crc kubenswrapper[4738]: I0307 07:16:02.444384 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:16:02 crc kubenswrapper[4738]: I0307 07:16:02.530992 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/openstack-galera-0" Mar 07 07:16:02 crc kubenswrapper[4738]: I0307 07:16:02.548031 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/root-account-create-update-j7gqh" event={"ID":"63ca5884-f8c3-464b-8621-1e97d77e5083","Type":"ContainerDied","Data":"0874c6a94addb236366125e5459e42a73767fc39c9f7b72e1fa294424ae099e5"} Mar 07 07:16:02 crc kubenswrapper[4738]: I0307 07:16:02.548065 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/root-account-create-update-j7gqh" Mar 07 07:16:02 crc kubenswrapper[4738]: I0307 07:16:02.548083 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0874c6a94addb236366125e5459e42a73767fc39c9f7b72e1fa294424ae099e5" Mar 07 07:16:02 crc kubenswrapper[4738]: I0307 07:16:02.550922 4738 generic.go:334] "Generic (PLEG): container finished" podID="077a5b42-3ee0-4fcd-90cd-72122279a352" containerID="bb3aeb5ab513d22a30fae4da917954a5069995ddc59fe2494f40016ad41743aa" exitCode=0 Mar 07 07:16:02 crc kubenswrapper[4738]: I0307 07:16:02.550969 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547796-qtj6f" event={"ID":"077a5b42-3ee0-4fcd-90cd-72122279a352","Type":"ContainerDied","Data":"bb3aeb5ab513d22a30fae4da917954a5069995ddc59fe2494f40016ad41743aa"} Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.339410 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq"] Mar 07 07:16:03 crc kubenswrapper[4738]: E0307 07:16:03.340125 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ca5884-f8c3-464b-8621-1e97d77e5083" containerName="mariadb-account-create-update" Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.340179 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ca5884-f8c3-464b-8621-1e97d77e5083" containerName="mariadb-account-create-update" Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.340391 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="63ca5884-f8c3-464b-8621-1e97d77e5083" containerName="mariadb-account-create-update" Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.361580 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.363689 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq"] Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.364937 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-fvfwj" Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.458744 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7rgx\" (UniqueName: \"kubernetes.io/projected/32694363-7ac1-464f-a71a-142236e5eab8-kube-api-access-k7rgx\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq\" (UID: \"32694363-7ac1-464f-a71a-142236e5eab8\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.458829 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32694363-7ac1-464f-a71a-142236e5eab8-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq\" (UID: \"32694363-7ac1-464f-a71a-142236e5eab8\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.458858 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32694363-7ac1-464f-a71a-142236e5eab8-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq\" (UID: \"32694363-7ac1-464f-a71a-142236e5eab8\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 
07:16:03.560368 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32694363-7ac1-464f-a71a-142236e5eab8-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq\" (UID: \"32694363-7ac1-464f-a71a-142236e5eab8\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.560435 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32694363-7ac1-464f-a71a-142236e5eab8-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq\" (UID: \"32694363-7ac1-464f-a71a-142236e5eab8\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.560604 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7rgx\" (UniqueName: \"kubernetes.io/projected/32694363-7ac1-464f-a71a-142236e5eab8-kube-api-access-k7rgx\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq\" (UID: \"32694363-7ac1-464f-a71a-142236e5eab8\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.561253 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32694363-7ac1-464f-a71a-142236e5eab8-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq\" (UID: \"32694363-7ac1-464f-a71a-142236e5eab8\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.561365 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/32694363-7ac1-464f-a71a-142236e5eab8-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq\" (UID: \"32694363-7ac1-464f-a71a-142236e5eab8\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.593803 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7rgx\" (UniqueName: \"kubernetes.io/projected/32694363-7ac1-464f-a71a-142236e5eab8-kube-api-access-k7rgx\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq\" (UID: \"32694363-7ac1-464f-a71a-142236e5eab8\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.684493 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" Mar 07 07:16:03 crc kubenswrapper[4738]: I0307 07:16:03.909054 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547796-qtj6f" Mar 07 07:16:04 crc kubenswrapper[4738]: I0307 07:16:04.071059 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hl2v\" (UniqueName: \"kubernetes.io/projected/077a5b42-3ee0-4fcd-90cd-72122279a352-kube-api-access-9hl2v\") pod \"077a5b42-3ee0-4fcd-90cd-72122279a352\" (UID: \"077a5b42-3ee0-4fcd-90cd-72122279a352\") " Mar 07 07:16:04 crc kubenswrapper[4738]: I0307 07:16:04.078254 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/077a5b42-3ee0-4fcd-90cd-72122279a352-kube-api-access-9hl2v" (OuterVolumeSpecName: "kube-api-access-9hl2v") pod "077a5b42-3ee0-4fcd-90cd-72122279a352" (UID: "077a5b42-3ee0-4fcd-90cd-72122279a352"). InnerVolumeSpecName "kube-api-access-9hl2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:16:04 crc kubenswrapper[4738]: I0307 07:16:04.146172 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq"] Mar 07 07:16:04 crc kubenswrapper[4738]: I0307 07:16:04.173448 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hl2v\" (UniqueName: \"kubernetes.io/projected/077a5b42-3ee0-4fcd-90cd-72122279a352-kube-api-access-9hl2v\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:04 crc kubenswrapper[4738]: I0307 07:16:04.575077 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" event={"ID":"32694363-7ac1-464f-a71a-142236e5eab8","Type":"ContainerStarted","Data":"f3daba2e7ec390d168c83ce7baea52b463b882f8d1edd93c9ad6e4c3d9c3ecd3"} Mar 07 07:16:04 crc kubenswrapper[4738]: I0307 07:16:04.575857 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" event={"ID":"32694363-7ac1-464f-a71a-142236e5eab8","Type":"ContainerStarted","Data":"b3e5892b9e210e2550d4645b7476b93dab7f6b0d4c5d24f25b98b83f00acc921"} Mar 07 07:16:04 crc kubenswrapper[4738]: I0307 07:16:04.580337 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547796-qtj6f" event={"ID":"077a5b42-3ee0-4fcd-90cd-72122279a352","Type":"ContainerDied","Data":"3e87b32e0328b643ff910019f96cba7fa54cf4836b7ddc60b5100f21f803a188"} Mar 07 07:16:04 crc kubenswrapper[4738]: I0307 07:16:04.580411 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e87b32e0328b643ff910019f96cba7fa54cf4836b7ddc60b5100f21f803a188" Mar 07 07:16:04 crc kubenswrapper[4738]: I0307 07:16:04.580496 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547796-qtj6f" Mar 07 07:16:04 crc kubenswrapper[4738]: I0307 07:16:04.973409 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547790-lp9mz"] Mar 07 07:16:04 crc kubenswrapper[4738]: I0307 07:16:04.979273 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547790-lp9mz"] Mar 07 07:16:05 crc kubenswrapper[4738]: I0307 07:16:05.588188 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" event={"ID":"32694363-7ac1-464f-a71a-142236e5eab8","Type":"ContainerDied","Data":"f3daba2e7ec390d168c83ce7baea52b463b882f8d1edd93c9ad6e4c3d9c3ecd3"} Mar 07 07:16:05 crc kubenswrapper[4738]: I0307 07:16:05.588980 4738 generic.go:334] "Generic (PLEG): container finished" podID="32694363-7ac1-464f-a71a-142236e5eab8" containerID="f3daba2e7ec390d168c83ce7baea52b463b882f8d1edd93c9ad6e4c3d9c3ecd3" exitCode=0 Mar 07 07:16:06 crc kubenswrapper[4738]: I0307 07:16:06.403590 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d93a749f-62f3-4014-9e01-dad1c1225fc1" path="/var/lib/kubelet/pods/d93a749f-62f3-4014-9e01-dad1c1225fc1/volumes" Mar 07 07:16:06 crc kubenswrapper[4738]: I0307 07:16:06.570630 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:16:06 crc kubenswrapper[4738]: I0307 07:16:06.602726 4738 generic.go:334] "Generic (PLEG): container finished" podID="32694363-7ac1-464f-a71a-142236e5eab8" containerID="ba52aa3684789ec89b98e39de0ccb9c29f17285e80f75126919ebcfba8c29037" exitCode=0 Mar 07 07:16:06 crc kubenswrapper[4738]: I0307 07:16:06.602787 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" 
event={"ID":"32694363-7ac1-464f-a71a-142236e5eab8","Type":"ContainerDied","Data":"ba52aa3684789ec89b98e39de0ccb9c29f17285e80f75126919ebcfba8c29037"} Mar 07 07:16:06 crc kubenswrapper[4738]: I0307 07:16:06.667989 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/openstack-galera-1" Mar 07 07:16:07 crc kubenswrapper[4738]: I0307 07:16:07.636665 4738 generic.go:334] "Generic (PLEG): container finished" podID="32694363-7ac1-464f-a71a-142236e5eab8" containerID="a6e28d666eaecca1f2d0eec07d4b6a9cb2d085b249f8ee466964190c72e368c6" exitCode=0 Mar 07 07:16:07 crc kubenswrapper[4738]: I0307 07:16:07.636728 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" event={"ID":"32694363-7ac1-464f-a71a-142236e5eab8","Type":"ContainerDied","Data":"a6e28d666eaecca1f2d0eec07d4b6a9cb2d085b249f8ee466964190c72e368c6"} Mar 07 07:16:08 crc kubenswrapper[4738]: I0307 07:16:08.971988 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" Mar 07 07:16:09 crc kubenswrapper[4738]: I0307 07:16:09.155405 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32694363-7ac1-464f-a71a-142236e5eab8-bundle\") pod \"32694363-7ac1-464f-a71a-142236e5eab8\" (UID: \"32694363-7ac1-464f-a71a-142236e5eab8\") " Mar 07 07:16:09 crc kubenswrapper[4738]: I0307 07:16:09.155789 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32694363-7ac1-464f-a71a-142236e5eab8-util\") pod \"32694363-7ac1-464f-a71a-142236e5eab8\" (UID: \"32694363-7ac1-464f-a71a-142236e5eab8\") " Mar 07 07:16:09 crc kubenswrapper[4738]: I0307 07:16:09.155922 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7rgx\" (UniqueName: \"kubernetes.io/projected/32694363-7ac1-464f-a71a-142236e5eab8-kube-api-access-k7rgx\") pod \"32694363-7ac1-464f-a71a-142236e5eab8\" (UID: \"32694363-7ac1-464f-a71a-142236e5eab8\") " Mar 07 07:16:09 crc kubenswrapper[4738]: I0307 07:16:09.156720 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32694363-7ac1-464f-a71a-142236e5eab8-bundle" (OuterVolumeSpecName: "bundle") pod "32694363-7ac1-464f-a71a-142236e5eab8" (UID: "32694363-7ac1-464f-a71a-142236e5eab8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:16:09 crc kubenswrapper[4738]: I0307 07:16:09.162448 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32694363-7ac1-464f-a71a-142236e5eab8-kube-api-access-k7rgx" (OuterVolumeSpecName: "kube-api-access-k7rgx") pod "32694363-7ac1-464f-a71a-142236e5eab8" (UID: "32694363-7ac1-464f-a71a-142236e5eab8"). InnerVolumeSpecName "kube-api-access-k7rgx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:16:09 crc kubenswrapper[4738]: I0307 07:16:09.177356 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32694363-7ac1-464f-a71a-142236e5eab8-util" (OuterVolumeSpecName: "util") pod "32694363-7ac1-464f-a71a-142236e5eab8" (UID: "32694363-7ac1-464f-a71a-142236e5eab8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:16:09 crc kubenswrapper[4738]: I0307 07:16:09.258239 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7rgx\" (UniqueName: \"kubernetes.io/projected/32694363-7ac1-464f-a71a-142236e5eab8-kube-api-access-k7rgx\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:09 crc kubenswrapper[4738]: I0307 07:16:09.258291 4738 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32694363-7ac1-464f-a71a-142236e5eab8-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:09 crc kubenswrapper[4738]: I0307 07:16:09.258303 4738 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32694363-7ac1-464f-a71a-142236e5eab8-util\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:09 crc kubenswrapper[4738]: I0307 07:16:09.654149 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" event={"ID":"32694363-7ac1-464f-a71a-142236e5eab8","Type":"ContainerDied","Data":"b3e5892b9e210e2550d4645b7476b93dab7f6b0d4c5d24f25b98b83f00acc921"} Mar 07 07:16:09 crc kubenswrapper[4738]: I0307 07:16:09.654209 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3e5892b9e210e2550d4645b7476b93dab7f6b0d4c5d24f25b98b83f00acc921" Mar 07 07:16:09 crc kubenswrapper[4738]: I0307 07:16:09.654239 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.301717 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wgkkv"] Mar 07 07:16:12 crc kubenswrapper[4738]: E0307 07:16:12.302538 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32694363-7ac1-464f-a71a-142236e5eab8" containerName="pull" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.302558 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="32694363-7ac1-464f-a71a-142236e5eab8" containerName="pull" Mar 07 07:16:12 crc kubenswrapper[4738]: E0307 07:16:12.302578 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32694363-7ac1-464f-a71a-142236e5eab8" containerName="extract" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.302585 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="32694363-7ac1-464f-a71a-142236e5eab8" containerName="extract" Mar 07 07:16:12 crc kubenswrapper[4738]: E0307 07:16:12.302597 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077a5b42-3ee0-4fcd-90cd-72122279a352" containerName="oc" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.302604 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="077a5b42-3ee0-4fcd-90cd-72122279a352" containerName="oc" Mar 07 07:16:12 crc kubenswrapper[4738]: E0307 07:16:12.302614 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32694363-7ac1-464f-a71a-142236e5eab8" containerName="util" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.302620 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="32694363-7ac1-464f-a71a-142236e5eab8" containerName="util" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.302756 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="32694363-7ac1-464f-a71a-142236e5eab8" containerName="extract" Mar 07 07:16:12 crc 
kubenswrapper[4738]: I0307 07:16:12.302775 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="077a5b42-3ee0-4fcd-90cd-72122279a352" containerName="oc" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.303753 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.322651 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wgkkv"] Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.408190 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a174fb-941f-49cf-b45c-f0911b621231-catalog-content\") pod \"redhat-operators-wgkkv\" (UID: \"65a174fb-941f-49cf-b45c-f0911b621231\") " pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.408393 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a174fb-941f-49cf-b45c-f0911b621231-utilities\") pod \"redhat-operators-wgkkv\" (UID: \"65a174fb-941f-49cf-b45c-f0911b621231\") " pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.408504 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtqfq\" (UniqueName: \"kubernetes.io/projected/65a174fb-941f-49cf-b45c-f0911b621231-kube-api-access-jtqfq\") pod \"redhat-operators-wgkkv\" (UID: \"65a174fb-941f-49cf-b45c-f0911b621231\") " pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.510487 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/65a174fb-941f-49cf-b45c-f0911b621231-catalog-content\") pod \"redhat-operators-wgkkv\" (UID: \"65a174fb-941f-49cf-b45c-f0911b621231\") " pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.510568 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a174fb-941f-49cf-b45c-f0911b621231-utilities\") pod \"redhat-operators-wgkkv\" (UID: \"65a174fb-941f-49cf-b45c-f0911b621231\") " pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.510597 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtqfq\" (UniqueName: \"kubernetes.io/projected/65a174fb-941f-49cf-b45c-f0911b621231-kube-api-access-jtqfq\") pod \"redhat-operators-wgkkv\" (UID: \"65a174fb-941f-49cf-b45c-f0911b621231\") " pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.511690 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a174fb-941f-49cf-b45c-f0911b621231-catalog-content\") pod \"redhat-operators-wgkkv\" (UID: \"65a174fb-941f-49cf-b45c-f0911b621231\") " pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.512227 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a174fb-941f-49cf-b45c-f0911b621231-utilities\") pod \"redhat-operators-wgkkv\" (UID: \"65a174fb-941f-49cf-b45c-f0911b621231\") " pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.534368 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtqfq\" (UniqueName: 
\"kubernetes.io/projected/65a174fb-941f-49cf-b45c-f0911b621231-kube-api-access-jtqfq\") pod \"redhat-operators-wgkkv\" (UID: \"65a174fb-941f-49cf-b45c-f0911b621231\") " pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.624336 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:12 crc kubenswrapper[4738]: I0307 07:16:12.869209 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wgkkv"] Mar 07 07:16:13 crc kubenswrapper[4738]: I0307 07:16:13.690612 4738 generic.go:334] "Generic (PLEG): container finished" podID="65a174fb-941f-49cf-b45c-f0911b621231" containerID="5cc6dd5a21c1497f51d6f4f4e406aa0c03ddd8b063cdfc2d7e2b7fa9f5dfb30e" exitCode=0 Mar 07 07:16:13 crc kubenswrapper[4738]: I0307 07:16:13.690710 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgkkv" event={"ID":"65a174fb-941f-49cf-b45c-f0911b621231","Type":"ContainerDied","Data":"5cc6dd5a21c1497f51d6f4f4e406aa0c03ddd8b063cdfc2d7e2b7fa9f5dfb30e"} Mar 07 07:16:13 crc kubenswrapper[4738]: I0307 07:16:13.690991 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgkkv" event={"ID":"65a174fb-941f-49cf-b45c-f0911b621231","Type":"ContainerStarted","Data":"6dc50d170a0a987541e224f12a645f1b4c6af87681233af02fe3d3d8bfe6508c"} Mar 07 07:16:14 crc kubenswrapper[4738]: I0307 07:16:14.697127 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgkkv" event={"ID":"65a174fb-941f-49cf-b45c-f0911b621231","Type":"ContainerStarted","Data":"02545c63768a26d5e39c30e2809de1eb9856883e4965dc768f3f3d8ab88d8fdf"} Mar 07 07:16:15 crc kubenswrapper[4738]: I0307 07:16:15.705263 4738 generic.go:334] "Generic (PLEG): container finished" podID="65a174fb-941f-49cf-b45c-f0911b621231" 
containerID="02545c63768a26d5e39c30e2809de1eb9856883e4965dc768f3f3d8ab88d8fdf" exitCode=0 Mar 07 07:16:15 crc kubenswrapper[4738]: I0307 07:16:15.705401 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgkkv" event={"ID":"65a174fb-941f-49cf-b45c-f0911b621231","Type":"ContainerDied","Data":"02545c63768a26d5e39c30e2809de1eb9856883e4965dc768f3f3d8ab88d8fdf"} Mar 07 07:16:15 crc kubenswrapper[4738]: I0307 07:16:15.894998 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6gkrg"] Mar 07 07:16:15 crc kubenswrapper[4738]: I0307 07:16:15.896516 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:15 crc kubenswrapper[4738]: I0307 07:16:15.909403 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6gkrg"] Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.070444 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmhpf\" (UniqueName: \"kubernetes.io/projected/be13512b-82bf-443f-b0a5-0913e77b0099-kube-api-access-lmhpf\") pod \"certified-operators-6gkrg\" (UID: \"be13512b-82bf-443f-b0a5-0913e77b0099\") " pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.070496 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be13512b-82bf-443f-b0a5-0913e77b0099-catalog-content\") pod \"certified-operators-6gkrg\" (UID: \"be13512b-82bf-443f-b0a5-0913e77b0099\") " pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.070573 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/be13512b-82bf-443f-b0a5-0913e77b0099-utilities\") pod \"certified-operators-6gkrg\" (UID: \"be13512b-82bf-443f-b0a5-0913e77b0099\") " pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.174177 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be13512b-82bf-443f-b0a5-0913e77b0099-utilities\") pod \"certified-operators-6gkrg\" (UID: \"be13512b-82bf-443f-b0a5-0913e77b0099\") " pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.174578 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmhpf\" (UniqueName: \"kubernetes.io/projected/be13512b-82bf-443f-b0a5-0913e77b0099-kube-api-access-lmhpf\") pod \"certified-operators-6gkrg\" (UID: \"be13512b-82bf-443f-b0a5-0913e77b0099\") " pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.174620 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be13512b-82bf-443f-b0a5-0913e77b0099-catalog-content\") pod \"certified-operators-6gkrg\" (UID: \"be13512b-82bf-443f-b0a5-0913e77b0099\") " pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.174921 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be13512b-82bf-443f-b0a5-0913e77b0099-utilities\") pod \"certified-operators-6gkrg\" (UID: \"be13512b-82bf-443f-b0a5-0913e77b0099\") " pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.175083 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/be13512b-82bf-443f-b0a5-0913e77b0099-catalog-content\") pod \"certified-operators-6gkrg\" (UID: \"be13512b-82bf-443f-b0a5-0913e77b0099\") " pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.202917 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmhpf\" (UniqueName: \"kubernetes.io/projected/be13512b-82bf-443f-b0a5-0913e77b0099-kube-api-access-lmhpf\") pod \"certified-operators-6gkrg\" (UID: \"be13512b-82bf-443f-b0a5-0913e77b0099\") " pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.210344 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.459767 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6gkrg"] Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.713286 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgkkv" event={"ID":"65a174fb-941f-49cf-b45c-f0911b621231","Type":"ContainerStarted","Data":"7bacf8f301aabff4aee61af6e2ea30dc74ba74c02033080af10e66ee433ef9af"} Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.715321 4738 generic.go:334] "Generic (PLEG): container finished" podID="be13512b-82bf-443f-b0a5-0913e77b0099" containerID="385f5372212c9389064afd0c055aa2c02cede5b1c5a161860ea5fd7420b89ccc" exitCode=0 Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.715366 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gkrg" event={"ID":"be13512b-82bf-443f-b0a5-0913e77b0099","Type":"ContainerDied","Data":"385f5372212c9389064afd0c055aa2c02cede5b1c5a161860ea5fd7420b89ccc"} Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.715393 4738 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/certified-operators-6gkrg" event={"ID":"be13512b-82bf-443f-b0a5-0913e77b0099","Type":"ContainerStarted","Data":"2cf3fe7059a91146c902a776d85af0ce627a308167d7cf951457089bc3ab1f76"} Mar 07 07:16:16 crc kubenswrapper[4738]: I0307 07:16:16.730736 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-wgkkv" podStartSLOduration=2.334868865 podStartE2EDuration="4.730716307s" podCreationTimestamp="2026-03-07 07:16:12 +0000 UTC" firstStartedPulling="2026-03-07 07:16:13.692371032 +0000 UTC m=+992.157358353" lastFinishedPulling="2026-03-07 07:16:16.088218474 +0000 UTC m=+994.553205795" observedRunningTime="2026-03-07 07:16:16.729104954 +0000 UTC m=+995.194092275" watchObservedRunningTime="2026-03-07 07:16:16.730716307 +0000 UTC m=+995.195703628" Mar 07 07:16:17 crc kubenswrapper[4738]: I0307 07:16:17.722613 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gkrg" event={"ID":"be13512b-82bf-443f-b0a5-0913e77b0099","Type":"ContainerStarted","Data":"49e89c647eb2e4e2cbbeab8df8c524d6d97f87b7d90073c483553f370700dbd7"} Mar 07 07:16:18 crc kubenswrapper[4738]: I0307 07:16:18.607144 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-nbhcm"] Mar 07 07:16:18 crc kubenswrapper[4738]: I0307 07:16:18.608944 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nbhcm" Mar 07 07:16:18 crc kubenswrapper[4738]: I0307 07:16:18.611785 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-k2m96" Mar 07 07:16:18 crc kubenswrapper[4738]: I0307 07:16:18.624405 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-nbhcm"] Mar 07 07:16:18 crc kubenswrapper[4738]: I0307 07:16:18.713120 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcpzf\" (UniqueName: \"kubernetes.io/projected/fb842fb6-8fb7-4601-a0be-bab337f47d4a-kube-api-access-vcpzf\") pod \"rabbitmq-cluster-operator-779fc9694b-nbhcm\" (UID: \"fb842fb6-8fb7-4601-a0be-bab337f47d4a\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nbhcm" Mar 07 07:16:18 crc kubenswrapper[4738]: I0307 07:16:18.731098 4738 generic.go:334] "Generic (PLEG): container finished" podID="be13512b-82bf-443f-b0a5-0913e77b0099" containerID="49e89c647eb2e4e2cbbeab8df8c524d6d97f87b7d90073c483553f370700dbd7" exitCode=0 Mar 07 07:16:18 crc kubenswrapper[4738]: I0307 07:16:18.731453 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gkrg" event={"ID":"be13512b-82bf-443f-b0a5-0913e77b0099","Type":"ContainerDied","Data":"49e89c647eb2e4e2cbbeab8df8c524d6d97f87b7d90073c483553f370700dbd7"} Mar 07 07:16:18 crc kubenswrapper[4738]: I0307 07:16:18.814791 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcpzf\" (UniqueName: \"kubernetes.io/projected/fb842fb6-8fb7-4601-a0be-bab337f47d4a-kube-api-access-vcpzf\") pod \"rabbitmq-cluster-operator-779fc9694b-nbhcm\" (UID: \"fb842fb6-8fb7-4601-a0be-bab337f47d4a\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nbhcm" Mar 07 07:16:18 crc kubenswrapper[4738]: 
I0307 07:16:18.841246 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcpzf\" (UniqueName: \"kubernetes.io/projected/fb842fb6-8fb7-4601-a0be-bab337f47d4a-kube-api-access-vcpzf\") pod \"rabbitmq-cluster-operator-779fc9694b-nbhcm\" (UID: \"fb842fb6-8fb7-4601-a0be-bab337f47d4a\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nbhcm" Mar 07 07:16:18 crc kubenswrapper[4738]: I0307 07:16:18.934244 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nbhcm" Mar 07 07:16:19 crc kubenswrapper[4738]: I0307 07:16:19.187290 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-nbhcm"] Mar 07 07:16:19 crc kubenswrapper[4738]: I0307 07:16:19.740631 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nbhcm" event={"ID":"fb842fb6-8fb7-4601-a0be-bab337f47d4a","Type":"ContainerStarted","Data":"33972399b495f8323ecdd28dd44114b11aea2e6ae23c6f6529f327902c381903"} Mar 07 07:16:19 crc kubenswrapper[4738]: I0307 07:16:19.743645 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gkrg" event={"ID":"be13512b-82bf-443f-b0a5-0913e77b0099","Type":"ContainerStarted","Data":"464a9e08a168e2676df101ea966b192229aee37b99e5173eebe244dddff43558"} Mar 07 07:16:19 crc kubenswrapper[4738]: I0307 07:16:19.778893 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6gkrg" podStartSLOduration=2.352489108 podStartE2EDuration="4.778852383s" podCreationTimestamp="2026-03-07 07:16:15 +0000 UTC" firstStartedPulling="2026-03-07 07:16:16.716432306 +0000 UTC m=+995.181419627" lastFinishedPulling="2026-03-07 07:16:19.142795571 +0000 UTC m=+997.607782902" observedRunningTime="2026-03-07 07:16:19.774566259 +0000 UTC m=+998.239553600" 
watchObservedRunningTime="2026-03-07 07:16:19.778852383 +0000 UTC m=+998.243839714" Mar 07 07:16:21 crc kubenswrapper[4738]: I0307 07:16:21.699605 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8ggg7"] Mar 07 07:16:21 crc kubenswrapper[4738]: I0307 07:16:21.701525 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:21 crc kubenswrapper[4738]: I0307 07:16:21.713653 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ggg7"] Mar 07 07:16:21 crc kubenswrapper[4738]: I0307 07:16:21.860964 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d2740c-c4da-4ae1-9788-3cfb9827d312-utilities\") pod \"redhat-marketplace-8ggg7\" (UID: \"52d2740c-c4da-4ae1-9788-3cfb9827d312\") " pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:21 crc kubenswrapper[4738]: I0307 07:16:21.861029 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d2740c-c4da-4ae1-9788-3cfb9827d312-catalog-content\") pod \"redhat-marketplace-8ggg7\" (UID: \"52d2740c-c4da-4ae1-9788-3cfb9827d312\") " pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:21 crc kubenswrapper[4738]: I0307 07:16:21.861060 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hst7x\" (UniqueName: \"kubernetes.io/projected/52d2740c-c4da-4ae1-9788-3cfb9827d312-kube-api-access-hst7x\") pod \"redhat-marketplace-8ggg7\" (UID: \"52d2740c-c4da-4ae1-9788-3cfb9827d312\") " pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:21 crc kubenswrapper[4738]: I0307 07:16:21.962196 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d2740c-c4da-4ae1-9788-3cfb9827d312-utilities\") pod \"redhat-marketplace-8ggg7\" (UID: \"52d2740c-c4da-4ae1-9788-3cfb9827d312\") " pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:21 crc kubenswrapper[4738]: I0307 07:16:21.962255 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d2740c-c4da-4ae1-9788-3cfb9827d312-catalog-content\") pod \"redhat-marketplace-8ggg7\" (UID: \"52d2740c-c4da-4ae1-9788-3cfb9827d312\") " pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:21 crc kubenswrapper[4738]: I0307 07:16:21.962295 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hst7x\" (UniqueName: \"kubernetes.io/projected/52d2740c-c4da-4ae1-9788-3cfb9827d312-kube-api-access-hst7x\") pod \"redhat-marketplace-8ggg7\" (UID: \"52d2740c-c4da-4ae1-9788-3cfb9827d312\") " pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:21 crc kubenswrapper[4738]: I0307 07:16:21.963289 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d2740c-c4da-4ae1-9788-3cfb9827d312-utilities\") pod \"redhat-marketplace-8ggg7\" (UID: \"52d2740c-c4da-4ae1-9788-3cfb9827d312\") " pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:21 crc kubenswrapper[4738]: I0307 07:16:21.963586 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d2740c-c4da-4ae1-9788-3cfb9827d312-catalog-content\") pod \"redhat-marketplace-8ggg7\" (UID: \"52d2740c-c4da-4ae1-9788-3cfb9827d312\") " pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:22 crc kubenswrapper[4738]: I0307 07:16:21.992023 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hst7x\" (UniqueName: 
\"kubernetes.io/projected/52d2740c-c4da-4ae1-9788-3cfb9827d312-kube-api-access-hst7x\") pod \"redhat-marketplace-8ggg7\" (UID: \"52d2740c-c4da-4ae1-9788-3cfb9827d312\") " pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:22 crc kubenswrapper[4738]: I0307 07:16:22.035784 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:22 crc kubenswrapper[4738]: I0307 07:16:22.625104 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:22 crc kubenswrapper[4738]: I0307 07:16:22.625816 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:23 crc kubenswrapper[4738]: I0307 07:16:23.680594 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-wgkkv" podUID="65a174fb-941f-49cf-b45c-f0911b621231" containerName="registry-server" probeResult="failure" output=< Mar 07 07:16:23 crc kubenswrapper[4738]: timeout: failed to connect service ":50051" within 1s Mar 07 07:16:23 crc kubenswrapper[4738]: > Mar 07 07:16:23 crc kubenswrapper[4738]: I0307 07:16:23.867541 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ggg7"] Mar 07 07:16:23 crc kubenswrapper[4738]: W0307 07:16:23.874700 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52d2740c_c4da_4ae1_9788_3cfb9827d312.slice/crio-2c1fac814274975dc9a5ce680f4fa9e4f460b57c9ea94b5881de3248fd52ee8c WatchSource:0}: Error finding container 2c1fac814274975dc9a5ce680f4fa9e4f460b57c9ea94b5881de3248fd52ee8c: Status 404 returned error can't find the container with id 2c1fac814274975dc9a5ce680f4fa9e4f460b57c9ea94b5881de3248fd52ee8c Mar 07 07:16:24 crc kubenswrapper[4738]: E0307 07:16:24.336452 4738 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52d2740c_c4da_4ae1_9788_3cfb9827d312.slice/crio-conmon-b6c0801973c127e39a8057733b20f727e9d9b9a5a4b265a992184815e33bb627.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52d2740c_c4da_4ae1_9788_3cfb9827d312.slice/crio-b6c0801973c127e39a8057733b20f727e9d9b9a5a4b265a992184815e33bb627.scope\": RecentStats: unable to find data in memory cache]" Mar 07 07:16:24 crc kubenswrapper[4738]: I0307 07:16:24.790609 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nbhcm" event={"ID":"fb842fb6-8fb7-4601-a0be-bab337f47d4a","Type":"ContainerStarted","Data":"17bd1a66feafc67646c5ed075eb443f57145fe62952ca0617a1468d1cc8e831d"} Mar 07 07:16:24 crc kubenswrapper[4738]: I0307 07:16:24.792711 4738 generic.go:334] "Generic (PLEG): container finished" podID="52d2740c-c4da-4ae1-9788-3cfb9827d312" containerID="b6c0801973c127e39a8057733b20f727e9d9b9a5a4b265a992184815e33bb627" exitCode=0 Mar 07 07:16:24 crc kubenswrapper[4738]: I0307 07:16:24.792761 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ggg7" event={"ID":"52d2740c-c4da-4ae1-9788-3cfb9827d312","Type":"ContainerDied","Data":"b6c0801973c127e39a8057733b20f727e9d9b9a5a4b265a992184815e33bb627"} Mar 07 07:16:24 crc kubenswrapper[4738]: I0307 07:16:24.792788 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ggg7" event={"ID":"52d2740c-c4da-4ae1-9788-3cfb9827d312","Type":"ContainerStarted","Data":"2c1fac814274975dc9a5ce680f4fa9e4f460b57c9ea94b5881de3248fd52ee8c"} Mar 07 07:16:24 crc kubenswrapper[4738]: I0307 07:16:24.875075 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-nbhcm" podStartSLOduration=2.381603098 podStartE2EDuration="6.875046574s" podCreationTimestamp="2026-03-07 07:16:18 +0000 UTC" firstStartedPulling="2026-03-07 07:16:19.215887939 +0000 UTC m=+997.680875250" lastFinishedPulling="2026-03-07 07:16:23.709331405 +0000 UTC m=+1002.174318726" observedRunningTime="2026-03-07 07:16:24.854976368 +0000 UTC m=+1003.319963689" watchObservedRunningTime="2026-03-07 07:16:24.875046574 +0000 UTC m=+1003.340033905" Mar 07 07:16:25 crc kubenswrapper[4738]: I0307 07:16:25.802973 4738 generic.go:334] "Generic (PLEG): container finished" podID="52d2740c-c4da-4ae1-9788-3cfb9827d312" containerID="45a8ba7970f18f0497e9265417f3fd016acbcb44acc4326b8f0ad053daa08005" exitCode=0 Mar 07 07:16:25 crc kubenswrapper[4738]: I0307 07:16:25.803060 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ggg7" event={"ID":"52d2740c-c4da-4ae1-9788-3cfb9827d312","Type":"ContainerDied","Data":"45a8ba7970f18f0497e9265417f3fd016acbcb44acc4326b8f0ad053daa08005"} Mar 07 07:16:26 crc kubenswrapper[4738]: I0307 07:16:26.210606 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:26 crc kubenswrapper[4738]: I0307 07:16:26.210658 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:26 crc kubenswrapper[4738]: I0307 07:16:26.260195 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:26 crc kubenswrapper[4738]: I0307 07:16:26.815054 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ggg7" event={"ID":"52d2740c-c4da-4ae1-9788-3cfb9827d312","Type":"ContainerStarted","Data":"7072a2ab0a86d76d65b45cb30ff4e04b7bfa9d727b7c2f9d39eda1d585b98fb2"} Mar 07 07:16:26 
crc kubenswrapper[4738]: I0307 07:16:26.836775 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8ggg7" podStartSLOduration=4.419896986 podStartE2EDuration="5.836753406s" podCreationTimestamp="2026-03-07 07:16:21 +0000 UTC" firstStartedPulling="2026-03-07 07:16:24.796888391 +0000 UTC m=+1003.261875722" lastFinishedPulling="2026-03-07 07:16:26.213744811 +0000 UTC m=+1004.678732142" observedRunningTime="2026-03-07 07:16:26.831459964 +0000 UTC m=+1005.296447285" watchObservedRunningTime="2026-03-07 07:16:26.836753406 +0000 UTC m=+1005.301740727" Mar 07 07:16:26 crc kubenswrapper[4738]: I0307 07:16:26.867859 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.198299 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"] Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.199696 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.204615 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"rabbitmq-default-user" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.204900 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"rabbitmq-server-conf" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.205217 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"rabbitmq-server-dockercfg-s8wxs" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.209546 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"rabbitmq-plugins-conf" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.210009 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"rabbitmq-erlang-cookie" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.215015 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"] Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.368509 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-25b83e75-cf03-431a-8ac3-f6220d421049\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25b83e75-cf03-431a-8ac3-f6220d421049\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.368573 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f61d3e1-a31b-4e55-8826-0613669b010c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 
07:16:27.368601 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f61d3e1-a31b-4e55-8826-0613669b010c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.368638 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f61d3e1-a31b-4e55-8826-0613669b010c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.368951 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f61d3e1-a31b-4e55-8826-0613669b010c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.369050 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f61d3e1-a31b-4e55-8826-0613669b010c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.369308 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfpsj\" (UniqueName: \"kubernetes.io/projected/3f61d3e1-a31b-4e55-8826-0613669b010c-kube-api-access-pfpsj\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 
07:16:27.369455 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f61d3e1-a31b-4e55-8826-0613669b010c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.470896 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfpsj\" (UniqueName: \"kubernetes.io/projected/3f61d3e1-a31b-4e55-8826-0613669b010c-kube-api-access-pfpsj\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.470999 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f61d3e1-a31b-4e55-8826-0613669b010c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.471065 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-25b83e75-cf03-431a-8ac3-f6220d421049\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25b83e75-cf03-431a-8ac3-f6220d421049\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.471108 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/3f61d3e1-a31b-4e55-8826-0613669b010c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.471140 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f61d3e1-a31b-4e55-8826-0613669b010c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.471223 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f61d3e1-a31b-4e55-8826-0613669b010c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.471327 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f61d3e1-a31b-4e55-8826-0613669b010c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.471371 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f61d3e1-a31b-4e55-8826-0613669b010c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.472186 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/3f61d3e1-a31b-4e55-8826-0613669b010c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.472417 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/3f61d3e1-a31b-4e55-8826-0613669b010c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.472461 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/3f61d3e1-a31b-4e55-8826-0613669b010c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.478640 4738 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.478677 4738 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-25b83e75-cf03-431a-8ac3-f6220d421049\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25b83e75-cf03-431a-8ac3-f6220d421049\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b82cea504058927728e2435bbeeafcd8348240d0ad689642bc952fecad44a11d/globalmount\"" pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.481636 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/3f61d3e1-a31b-4e55-8826-0613669b010c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.485010 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/3f61d3e1-a31b-4e55-8826-0613669b010c-rabbitmq-confd\") pod 
\"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.486549 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/3f61d3e1-a31b-4e55-8826-0613669b010c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.498291 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfpsj\" (UniqueName: \"kubernetes.io/projected/3f61d3e1-a31b-4e55-8826-0613669b010c-kube-api-access-pfpsj\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.521308 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-25b83e75-cf03-431a-8ac3-f6220d421049\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-25b83e75-cf03-431a-8ac3-f6220d421049\") pod \"rabbitmq-server-0\" (UID: \"3f61d3e1-a31b-4e55-8826-0613669b010c\") " pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:27 crc kubenswrapper[4738]: I0307 07:16:27.819649 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:16:28 crc kubenswrapper[4738]: I0307 07:16:28.100847 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/rabbitmq-server-0"] Mar 07 07:16:28 crc kubenswrapper[4738]: I0307 07:16:28.827853 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"3f61d3e1-a31b-4e55-8826-0613669b010c","Type":"ContainerStarted","Data":"83c69376e8a0736052f74668fb1576d665a771665499f7451fb93a74499216b3"} Mar 07 07:16:29 crc kubenswrapper[4738]: I0307 07:16:29.911490 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-qg2d2"] Mar 07 07:16:29 crc kubenswrapper[4738]: I0307 07:16:29.912588 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-qg2d2" Mar 07 07:16:29 crc kubenswrapper[4738]: I0307 07:16:29.916784 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-xkvs2" Mar 07 07:16:29 crc kubenswrapper[4738]: I0307 07:16:29.918264 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-qg2d2"] Mar 07 07:16:29 crc kubenswrapper[4738]: I0307 07:16:29.924653 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7p7q\" (UniqueName: \"kubernetes.io/projected/1d4db5b1-0bd5-45dd-a926-40d569a6eeb5-kube-api-access-c7p7q\") pod \"keystone-operator-index-qg2d2\" (UID: \"1d4db5b1-0bd5-45dd-a926-40d569a6eeb5\") " pod="openstack-operators/keystone-operator-index-qg2d2" Mar 07 07:16:30 crc kubenswrapper[4738]: I0307 07:16:30.025732 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7p7q\" (UniqueName: \"kubernetes.io/projected/1d4db5b1-0bd5-45dd-a926-40d569a6eeb5-kube-api-access-c7p7q\") pod 
\"keystone-operator-index-qg2d2\" (UID: \"1d4db5b1-0bd5-45dd-a926-40d569a6eeb5\") " pod="openstack-operators/keystone-operator-index-qg2d2" Mar 07 07:16:30 crc kubenswrapper[4738]: I0307 07:16:30.050478 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7p7q\" (UniqueName: \"kubernetes.io/projected/1d4db5b1-0bd5-45dd-a926-40d569a6eeb5-kube-api-access-c7p7q\") pod \"keystone-operator-index-qg2d2\" (UID: \"1d4db5b1-0bd5-45dd-a926-40d569a6eeb5\") " pod="openstack-operators/keystone-operator-index-qg2d2" Mar 07 07:16:30 crc kubenswrapper[4738]: I0307 07:16:30.239985 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-qg2d2" Mar 07 07:16:31 crc kubenswrapper[4738]: I0307 07:16:31.865943 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-qg2d2"] Mar 07 07:16:31 crc kubenswrapper[4738]: W0307 07:16:31.872136 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d4db5b1_0bd5_45dd_a926_40d569a6eeb5.slice/crio-f919eefa607a7ec1688cb5f2a5802d58ecd2fece36796daeb3e332447db61f24 WatchSource:0}: Error finding container f919eefa607a7ec1688cb5f2a5802d58ecd2fece36796daeb3e332447db61f24: Status 404 returned error can't find the container with id f919eefa607a7ec1688cb5f2a5802d58ecd2fece36796daeb3e332447db61f24 Mar 07 07:16:32 crc kubenswrapper[4738]: I0307 07:16:32.036265 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:32 crc kubenswrapper[4738]: I0307 07:16:32.036371 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:32 crc kubenswrapper[4738]: I0307 07:16:32.111457 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:32 crc kubenswrapper[4738]: I0307 07:16:32.668551 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:32 crc kubenswrapper[4738]: I0307 07:16:32.687499 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6gkrg"] Mar 07 07:16:32 crc kubenswrapper[4738]: I0307 07:16:32.687824 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6gkrg" podUID="be13512b-82bf-443f-b0a5-0913e77b0099" containerName="registry-server" containerID="cri-o://464a9e08a168e2676df101ea966b192229aee37b99e5173eebe244dddff43558" gracePeriod=2 Mar 07 07:16:32 crc kubenswrapper[4738]: I0307 07:16:32.727592 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:32 crc kubenswrapper[4738]: I0307 07:16:32.860498 4738 generic.go:334] "Generic (PLEG): container finished" podID="be13512b-82bf-443f-b0a5-0913e77b0099" containerID="464a9e08a168e2676df101ea966b192229aee37b99e5173eebe244dddff43558" exitCode=0 Mar 07 07:16:32 crc kubenswrapper[4738]: I0307 07:16:32.860581 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gkrg" event={"ID":"be13512b-82bf-443f-b0a5-0913e77b0099","Type":"ContainerDied","Data":"464a9e08a168e2676df101ea966b192229aee37b99e5173eebe244dddff43558"} Mar 07 07:16:32 crc kubenswrapper[4738]: I0307 07:16:32.861979 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-qg2d2" event={"ID":"1d4db5b1-0bd5-45dd-a926-40d569a6eeb5","Type":"ContainerStarted","Data":"f919eefa607a7ec1688cb5f2a5802d58ecd2fece36796daeb3e332447db61f24"} Mar 07 07:16:32 crc kubenswrapper[4738]: I0307 07:16:32.901332 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:34 crc kubenswrapper[4738]: I0307 07:16:34.154925 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:34 crc kubenswrapper[4738]: I0307 07:16:34.296476 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be13512b-82bf-443f-b0a5-0913e77b0099-catalog-content\") pod \"be13512b-82bf-443f-b0a5-0913e77b0099\" (UID: \"be13512b-82bf-443f-b0a5-0913e77b0099\") " Mar 07 07:16:34 crc kubenswrapper[4738]: I0307 07:16:34.296558 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmhpf\" (UniqueName: \"kubernetes.io/projected/be13512b-82bf-443f-b0a5-0913e77b0099-kube-api-access-lmhpf\") pod \"be13512b-82bf-443f-b0a5-0913e77b0099\" (UID: \"be13512b-82bf-443f-b0a5-0913e77b0099\") " Mar 07 07:16:34 crc kubenswrapper[4738]: I0307 07:16:34.296627 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be13512b-82bf-443f-b0a5-0913e77b0099-utilities\") pod \"be13512b-82bf-443f-b0a5-0913e77b0099\" (UID: \"be13512b-82bf-443f-b0a5-0913e77b0099\") " Mar 07 07:16:34 crc kubenswrapper[4738]: I0307 07:16:34.298402 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be13512b-82bf-443f-b0a5-0913e77b0099-utilities" (OuterVolumeSpecName: "utilities") pod "be13512b-82bf-443f-b0a5-0913e77b0099" (UID: "be13512b-82bf-443f-b0a5-0913e77b0099"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:16:34 crc kubenswrapper[4738]: I0307 07:16:34.312628 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be13512b-82bf-443f-b0a5-0913e77b0099-kube-api-access-lmhpf" (OuterVolumeSpecName: "kube-api-access-lmhpf") pod "be13512b-82bf-443f-b0a5-0913e77b0099" (UID: "be13512b-82bf-443f-b0a5-0913e77b0099"). InnerVolumeSpecName "kube-api-access-lmhpf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:16:34 crc kubenswrapper[4738]: I0307 07:16:34.370419 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be13512b-82bf-443f-b0a5-0913e77b0099-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "be13512b-82bf-443f-b0a5-0913e77b0099" (UID: "be13512b-82bf-443f-b0a5-0913e77b0099"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:16:34 crc kubenswrapper[4738]: I0307 07:16:34.400347 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/be13512b-82bf-443f-b0a5-0913e77b0099-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:34 crc kubenswrapper[4738]: I0307 07:16:34.400391 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmhpf\" (UniqueName: \"kubernetes.io/projected/be13512b-82bf-443f-b0a5-0913e77b0099-kube-api-access-lmhpf\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:34 crc kubenswrapper[4738]: I0307 07:16:34.400406 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/be13512b-82bf-443f-b0a5-0913e77b0099-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:34 crc kubenswrapper[4738]: E0307 07:16:34.525148 4738 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe13512b_82bf_443f_b0a5_0913e77b0099.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe13512b_82bf_443f_b0a5_0913e77b0099.slice/crio-2cf3fe7059a91146c902a776d85af0ce627a308167d7cf951457089bc3ab1f76\": RecentStats: unable to find data in memory cache]" Mar 07 07:16:34 crc kubenswrapper[4738]: I0307 07:16:34.888091 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6gkrg" event={"ID":"be13512b-82bf-443f-b0a5-0913e77b0099","Type":"ContainerDied","Data":"2cf3fe7059a91146c902a776d85af0ce627a308167d7cf951457089bc3ab1f76"} Mar 07 07:16:34 crc kubenswrapper[4738]: I0307 07:16:34.888190 4738 scope.go:117] "RemoveContainer" containerID="464a9e08a168e2676df101ea966b192229aee37b99e5173eebe244dddff43558" Mar 07 07:16:34 crc kubenswrapper[4738]: I0307 07:16:34.888183 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6gkrg" Mar 07 07:16:34 crc kubenswrapper[4738]: I0307 07:16:34.911786 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6gkrg"] Mar 07 07:16:34 crc kubenswrapper[4738]: I0307 07:16:34.916470 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6gkrg"] Mar 07 07:16:35 crc kubenswrapper[4738]: I0307 07:16:35.248867 4738 scope.go:117] "RemoveContainer" containerID="49e89c647eb2e4e2cbbeab8df8c524d6d97f87b7d90073c483553f370700dbd7" Mar 07 07:16:35 crc kubenswrapper[4738]: I0307 07:16:35.696422 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ggg7"] Mar 07 07:16:35 crc kubenswrapper[4738]: I0307 07:16:35.896049 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8ggg7" podUID="52d2740c-c4da-4ae1-9788-3cfb9827d312" containerName="registry-server" containerID="cri-o://7072a2ab0a86d76d65b45cb30ff4e04b7bfa9d727b7c2f9d39eda1d585b98fb2" gracePeriod=2 Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.007631 4738 scope.go:117] "RemoveContainer" containerID="385f5372212c9389064afd0c055aa2c02cede5b1c5a161860ea5fd7420b89ccc" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.348551 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.402453 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be13512b-82bf-443f-b0a5-0913e77b0099" path="/var/lib/kubelet/pods/be13512b-82bf-443f-b0a5-0913e77b0099/volumes" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.439753 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d2740c-c4da-4ae1-9788-3cfb9827d312-catalog-content\") pod \"52d2740c-c4da-4ae1-9788-3cfb9827d312\" (UID: \"52d2740c-c4da-4ae1-9788-3cfb9827d312\") " Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.439864 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hst7x\" (UniqueName: \"kubernetes.io/projected/52d2740c-c4da-4ae1-9788-3cfb9827d312-kube-api-access-hst7x\") pod \"52d2740c-c4da-4ae1-9788-3cfb9827d312\" (UID: \"52d2740c-c4da-4ae1-9788-3cfb9827d312\") " Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.439906 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d2740c-c4da-4ae1-9788-3cfb9827d312-utilities\") pod \"52d2740c-c4da-4ae1-9788-3cfb9827d312\" (UID: \"52d2740c-c4da-4ae1-9788-3cfb9827d312\") " Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.441312 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52d2740c-c4da-4ae1-9788-3cfb9827d312-utilities" (OuterVolumeSpecName: "utilities") pod "52d2740c-c4da-4ae1-9788-3cfb9827d312" (UID: "52d2740c-c4da-4ae1-9788-3cfb9827d312"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.447431 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52d2740c-c4da-4ae1-9788-3cfb9827d312-kube-api-access-hst7x" (OuterVolumeSpecName: "kube-api-access-hst7x") pod "52d2740c-c4da-4ae1-9788-3cfb9827d312" (UID: "52d2740c-c4da-4ae1-9788-3cfb9827d312"). InnerVolumeSpecName "kube-api-access-hst7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.478184 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52d2740c-c4da-4ae1-9788-3cfb9827d312-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "52d2740c-c4da-4ae1-9788-3cfb9827d312" (UID: "52d2740c-c4da-4ae1-9788-3cfb9827d312"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.541098 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/52d2740c-c4da-4ae1-9788-3cfb9827d312-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.541138 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hst7x\" (UniqueName: \"kubernetes.io/projected/52d2740c-c4da-4ae1-9788-3cfb9827d312-kube-api-access-hst7x\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.541156 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/52d2740c-c4da-4ae1-9788-3cfb9827d312-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.903495 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-qg2d2" 
event={"ID":"1d4db5b1-0bd5-45dd-a926-40d569a6eeb5","Type":"ContainerStarted","Data":"427f9596c7ba837073b37802e782a7436088f98fcedacd12a60719ce61806eab"} Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.906813 4738 generic.go:334] "Generic (PLEG): container finished" podID="52d2740c-c4da-4ae1-9788-3cfb9827d312" containerID="7072a2ab0a86d76d65b45cb30ff4e04b7bfa9d727b7c2f9d39eda1d585b98fb2" exitCode=0 Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.906877 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ggg7" event={"ID":"52d2740c-c4da-4ae1-9788-3cfb9827d312","Type":"ContainerDied","Data":"7072a2ab0a86d76d65b45cb30ff4e04b7bfa9d727b7c2f9d39eda1d585b98fb2"} Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.906902 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ggg7" event={"ID":"52d2740c-c4da-4ae1-9788-3cfb9827d312","Type":"ContainerDied","Data":"2c1fac814274975dc9a5ce680f4fa9e4f460b57c9ea94b5881de3248fd52ee8c"} Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.906922 4738 scope.go:117] "RemoveContainer" containerID="7072a2ab0a86d76d65b45cb30ff4e04b7bfa9d727b7c2f9d39eda1d585b98fb2" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.906999 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ggg7" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.926627 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-qg2d2" podStartSLOduration=3.811076319 podStartE2EDuration="7.926607453s" podCreationTimestamp="2026-03-07 07:16:29 +0000 UTC" firstStartedPulling="2026-03-07 07:16:31.877355293 +0000 UTC m=+1010.342342654" lastFinishedPulling="2026-03-07 07:16:35.992886467 +0000 UTC m=+1014.457873788" observedRunningTime="2026-03-07 07:16:36.920758187 +0000 UTC m=+1015.385745518" watchObservedRunningTime="2026-03-07 07:16:36.926607453 +0000 UTC m=+1015.391594784" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.930725 4738 scope.go:117] "RemoveContainer" containerID="45a8ba7970f18f0497e9265417f3fd016acbcb44acc4326b8f0ad053daa08005" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.941919 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ggg7"] Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.948719 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ggg7"] Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.960873 4738 scope.go:117] "RemoveContainer" containerID="b6c0801973c127e39a8057733b20f727e9d9b9a5a4b265a992184815e33bb627" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.976204 4738 scope.go:117] "RemoveContainer" containerID="7072a2ab0a86d76d65b45cb30ff4e04b7bfa9d727b7c2f9d39eda1d585b98fb2" Mar 07 07:16:36 crc kubenswrapper[4738]: E0307 07:16:36.976604 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7072a2ab0a86d76d65b45cb30ff4e04b7bfa9d727b7c2f9d39eda1d585b98fb2\": container with ID starting with 7072a2ab0a86d76d65b45cb30ff4e04b7bfa9d727b7c2f9d39eda1d585b98fb2 not found: ID does not exist" 
containerID="7072a2ab0a86d76d65b45cb30ff4e04b7bfa9d727b7c2f9d39eda1d585b98fb2" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.976642 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7072a2ab0a86d76d65b45cb30ff4e04b7bfa9d727b7c2f9d39eda1d585b98fb2"} err="failed to get container status \"7072a2ab0a86d76d65b45cb30ff4e04b7bfa9d727b7c2f9d39eda1d585b98fb2\": rpc error: code = NotFound desc = could not find container \"7072a2ab0a86d76d65b45cb30ff4e04b7bfa9d727b7c2f9d39eda1d585b98fb2\": container with ID starting with 7072a2ab0a86d76d65b45cb30ff4e04b7bfa9d727b7c2f9d39eda1d585b98fb2 not found: ID does not exist" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.976666 4738 scope.go:117] "RemoveContainer" containerID="45a8ba7970f18f0497e9265417f3fd016acbcb44acc4326b8f0ad053daa08005" Mar 07 07:16:36 crc kubenswrapper[4738]: E0307 07:16:36.977088 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45a8ba7970f18f0497e9265417f3fd016acbcb44acc4326b8f0ad053daa08005\": container with ID starting with 45a8ba7970f18f0497e9265417f3fd016acbcb44acc4326b8f0ad053daa08005 not found: ID does not exist" containerID="45a8ba7970f18f0497e9265417f3fd016acbcb44acc4326b8f0ad053daa08005" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.977131 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45a8ba7970f18f0497e9265417f3fd016acbcb44acc4326b8f0ad053daa08005"} err="failed to get container status \"45a8ba7970f18f0497e9265417f3fd016acbcb44acc4326b8f0ad053daa08005\": rpc error: code = NotFound desc = could not find container \"45a8ba7970f18f0497e9265417f3fd016acbcb44acc4326b8f0ad053daa08005\": container with ID starting with 45a8ba7970f18f0497e9265417f3fd016acbcb44acc4326b8f0ad053daa08005 not found: ID does not exist" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.977176 4738 scope.go:117] 
"RemoveContainer" containerID="b6c0801973c127e39a8057733b20f727e9d9b9a5a4b265a992184815e33bb627" Mar 07 07:16:36 crc kubenswrapper[4738]: E0307 07:16:36.977791 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6c0801973c127e39a8057733b20f727e9d9b9a5a4b265a992184815e33bb627\": container with ID starting with b6c0801973c127e39a8057733b20f727e9d9b9a5a4b265a992184815e33bb627 not found: ID does not exist" containerID="b6c0801973c127e39a8057733b20f727e9d9b9a5a4b265a992184815e33bb627" Mar 07 07:16:36 crc kubenswrapper[4738]: I0307 07:16:36.977834 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6c0801973c127e39a8057733b20f727e9d9b9a5a4b265a992184815e33bb627"} err="failed to get container status \"b6c0801973c127e39a8057733b20f727e9d9b9a5a4b265a992184815e33bb627\": rpc error: code = NotFound desc = could not find container \"b6c0801973c127e39a8057733b20f727e9d9b9a5a4b265a992184815e33bb627\": container with ID starting with b6c0801973c127e39a8057733b20f727e9d9b9a5a4b265a992184815e33bb627 not found: ID does not exist" Mar 07 07:16:37 crc kubenswrapper[4738]: I0307 07:16:37.924983 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"3f61d3e1-a31b-4e55-8826-0613669b010c","Type":"ContainerStarted","Data":"96fb6a3051ecf8b552e0ae49820ceac6054d137eb99d0cc6624168364e9c4aff"} Mar 07 07:16:38 crc kubenswrapper[4738]: I0307 07:16:38.393591 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52d2740c-c4da-4ae1-9788-3cfb9827d312" path="/var/lib/kubelet/pods/52d2740c-c4da-4ae1-9788-3cfb9827d312/volumes" Mar 07 07:16:40 crc kubenswrapper[4738]: I0307 07:16:40.241041 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-qg2d2" Mar 07 07:16:40 crc kubenswrapper[4738]: I0307 07:16:40.241425 4738 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-qg2d2" Mar 07 07:16:40 crc kubenswrapper[4738]: I0307 07:16:40.269024 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-qg2d2" Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.087899 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wgkkv"] Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.088137 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wgkkv" podUID="65a174fb-941f-49cf-b45c-f0911b621231" containerName="registry-server" containerID="cri-o://7bacf8f301aabff4aee61af6e2ea30dc74ba74c02033080af10e66ee433ef9af" gracePeriod=2 Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.546136 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.616825 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a174fb-941f-49cf-b45c-f0911b621231-catalog-content\") pod \"65a174fb-941f-49cf-b45c-f0911b621231\" (UID: \"65a174fb-941f-49cf-b45c-f0911b621231\") " Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.616957 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtqfq\" (UniqueName: \"kubernetes.io/projected/65a174fb-941f-49cf-b45c-f0911b621231-kube-api-access-jtqfq\") pod \"65a174fb-941f-49cf-b45c-f0911b621231\" (UID: \"65a174fb-941f-49cf-b45c-f0911b621231\") " Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.617009 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a174fb-941f-49cf-b45c-f0911b621231-utilities\") 
pod \"65a174fb-941f-49cf-b45c-f0911b621231\" (UID: \"65a174fb-941f-49cf-b45c-f0911b621231\") " Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.617990 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a174fb-941f-49cf-b45c-f0911b621231-utilities" (OuterVolumeSpecName: "utilities") pod "65a174fb-941f-49cf-b45c-f0911b621231" (UID: "65a174fb-941f-49cf-b45c-f0911b621231"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.623442 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a174fb-941f-49cf-b45c-f0911b621231-kube-api-access-jtqfq" (OuterVolumeSpecName: "kube-api-access-jtqfq") pod "65a174fb-941f-49cf-b45c-f0911b621231" (UID: "65a174fb-941f-49cf-b45c-f0911b621231"). InnerVolumeSpecName "kube-api-access-jtqfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.718559 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtqfq\" (UniqueName: \"kubernetes.io/projected/65a174fb-941f-49cf-b45c-f0911b621231-kube-api-access-jtqfq\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.718594 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/65a174fb-941f-49cf-b45c-f0911b621231-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.748994 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a174fb-941f-49cf-b45c-f0911b621231-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "65a174fb-941f-49cf-b45c-f0911b621231" (UID: "65a174fb-941f-49cf-b45c-f0911b621231"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.820043 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/65a174fb-941f-49cf-b45c-f0911b621231-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.968520 4738 generic.go:334] "Generic (PLEG): container finished" podID="65a174fb-941f-49cf-b45c-f0911b621231" containerID="7bacf8f301aabff4aee61af6e2ea30dc74ba74c02033080af10e66ee433ef9af" exitCode=0 Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.968569 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgkkv" event={"ID":"65a174fb-941f-49cf-b45c-f0911b621231","Type":"ContainerDied","Data":"7bacf8f301aabff4aee61af6e2ea30dc74ba74c02033080af10e66ee433ef9af"} Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.968597 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wgkkv" event={"ID":"65a174fb-941f-49cf-b45c-f0911b621231","Type":"ContainerDied","Data":"6dc50d170a0a987541e224f12a645f1b4c6af87681233af02fe3d3d8bfe6508c"} Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.968615 4738 scope.go:117] "RemoveContainer" containerID="7bacf8f301aabff4aee61af6e2ea30dc74ba74c02033080af10e66ee433ef9af" Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.968612 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-wgkkv" Mar 07 07:16:41 crc kubenswrapper[4738]: I0307 07:16:41.985866 4738 scope.go:117] "RemoveContainer" containerID="02545c63768a26d5e39c30e2809de1eb9856883e4965dc768f3f3d8ab88d8fdf" Mar 07 07:16:42 crc kubenswrapper[4738]: I0307 07:16:42.012228 4738 scope.go:117] "RemoveContainer" containerID="5cc6dd5a21c1497f51d6f4f4e406aa0c03ddd8b063cdfc2d7e2b7fa9f5dfb30e" Mar 07 07:16:42 crc kubenswrapper[4738]: I0307 07:16:42.014186 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wgkkv"] Mar 07 07:16:42 crc kubenswrapper[4738]: I0307 07:16:42.019240 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wgkkv"] Mar 07 07:16:42 crc kubenswrapper[4738]: I0307 07:16:42.035154 4738 scope.go:117] "RemoveContainer" containerID="7bacf8f301aabff4aee61af6e2ea30dc74ba74c02033080af10e66ee433ef9af" Mar 07 07:16:42 crc kubenswrapper[4738]: E0307 07:16:42.035602 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7bacf8f301aabff4aee61af6e2ea30dc74ba74c02033080af10e66ee433ef9af\": container with ID starting with 7bacf8f301aabff4aee61af6e2ea30dc74ba74c02033080af10e66ee433ef9af not found: ID does not exist" containerID="7bacf8f301aabff4aee61af6e2ea30dc74ba74c02033080af10e66ee433ef9af" Mar 07 07:16:42 crc kubenswrapper[4738]: I0307 07:16:42.035643 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7bacf8f301aabff4aee61af6e2ea30dc74ba74c02033080af10e66ee433ef9af"} err="failed to get container status \"7bacf8f301aabff4aee61af6e2ea30dc74ba74c02033080af10e66ee433ef9af\": rpc error: code = NotFound desc = could not find container \"7bacf8f301aabff4aee61af6e2ea30dc74ba74c02033080af10e66ee433ef9af\": container with ID starting with 7bacf8f301aabff4aee61af6e2ea30dc74ba74c02033080af10e66ee433ef9af not found: ID does 
not exist" Mar 07 07:16:42 crc kubenswrapper[4738]: I0307 07:16:42.035670 4738 scope.go:117] "RemoveContainer" containerID="02545c63768a26d5e39c30e2809de1eb9856883e4965dc768f3f3d8ab88d8fdf" Mar 07 07:16:42 crc kubenswrapper[4738]: E0307 07:16:42.036080 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02545c63768a26d5e39c30e2809de1eb9856883e4965dc768f3f3d8ab88d8fdf\": container with ID starting with 02545c63768a26d5e39c30e2809de1eb9856883e4965dc768f3f3d8ab88d8fdf not found: ID does not exist" containerID="02545c63768a26d5e39c30e2809de1eb9856883e4965dc768f3f3d8ab88d8fdf" Mar 07 07:16:42 crc kubenswrapper[4738]: I0307 07:16:42.036108 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02545c63768a26d5e39c30e2809de1eb9856883e4965dc768f3f3d8ab88d8fdf"} err="failed to get container status \"02545c63768a26d5e39c30e2809de1eb9856883e4965dc768f3f3d8ab88d8fdf\": rpc error: code = NotFound desc = could not find container \"02545c63768a26d5e39c30e2809de1eb9856883e4965dc768f3f3d8ab88d8fdf\": container with ID starting with 02545c63768a26d5e39c30e2809de1eb9856883e4965dc768f3f3d8ab88d8fdf not found: ID does not exist" Mar 07 07:16:42 crc kubenswrapper[4738]: I0307 07:16:42.036123 4738 scope.go:117] "RemoveContainer" containerID="5cc6dd5a21c1497f51d6f4f4e406aa0c03ddd8b063cdfc2d7e2b7fa9f5dfb30e" Mar 07 07:16:42 crc kubenswrapper[4738]: E0307 07:16:42.036436 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc6dd5a21c1497f51d6f4f4e406aa0c03ddd8b063cdfc2d7e2b7fa9f5dfb30e\": container with ID starting with 5cc6dd5a21c1497f51d6f4f4e406aa0c03ddd8b063cdfc2d7e2b7fa9f5dfb30e not found: ID does not exist" containerID="5cc6dd5a21c1497f51d6f4f4e406aa0c03ddd8b063cdfc2d7e2b7fa9f5dfb30e" Mar 07 07:16:42 crc kubenswrapper[4738]: I0307 07:16:42.036465 4738 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc6dd5a21c1497f51d6f4f4e406aa0c03ddd8b063cdfc2d7e2b7fa9f5dfb30e"} err="failed to get container status \"5cc6dd5a21c1497f51d6f4f4e406aa0c03ddd8b063cdfc2d7e2b7fa9f5dfb30e\": rpc error: code = NotFound desc = could not find container \"5cc6dd5a21c1497f51d6f4f4e406aa0c03ddd8b063cdfc2d7e2b7fa9f5dfb30e\": container with ID starting with 5cc6dd5a21c1497f51d6f4f4e406aa0c03ddd8b063cdfc2d7e2b7fa9f5dfb30e not found: ID does not exist" Mar 07 07:16:42 crc kubenswrapper[4738]: I0307 07:16:42.394193 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a174fb-941f-49cf-b45c-f0911b621231" path="/var/lib/kubelet/pods/65a174fb-941f-49cf-b45c-f0911b621231/volumes" Mar 07 07:16:43 crc kubenswrapper[4738]: I0307 07:16:43.898508 4738 scope.go:117] "RemoveContainer" containerID="4fbdf0ecdd2715af02dc4218a4dfdda6a1324de3b79d4d9faead8fb2c675d9bf" Mar 07 07:16:50 crc kubenswrapper[4738]: I0307 07:16:50.281859 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-qg2d2" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.156133 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm"] Mar 07 07:16:51 crc kubenswrapper[4738]: E0307 07:16:51.156949 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be13512b-82bf-443f-b0a5-0913e77b0099" containerName="registry-server" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.157086 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="be13512b-82bf-443f-b0a5-0913e77b0099" containerName="registry-server" Mar 07 07:16:51 crc kubenswrapper[4738]: E0307 07:16:51.157250 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a174fb-941f-49cf-b45c-f0911b621231" containerName="extract-content" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.157358 4738 
state_mem.go:107] "Deleted CPUSet assignment" podUID="65a174fb-941f-49cf-b45c-f0911b621231" containerName="extract-content" Mar 07 07:16:51 crc kubenswrapper[4738]: E0307 07:16:51.157506 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d2740c-c4da-4ae1-9788-3cfb9827d312" containerName="extract-content" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.157622 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d2740c-c4da-4ae1-9788-3cfb9827d312" containerName="extract-content" Mar 07 07:16:51 crc kubenswrapper[4738]: E0307 07:16:51.157725 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a174fb-941f-49cf-b45c-f0911b621231" containerName="registry-server" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.157821 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a174fb-941f-49cf-b45c-f0911b621231" containerName="registry-server" Mar 07 07:16:51 crc kubenswrapper[4738]: E0307 07:16:51.157921 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d2740c-c4da-4ae1-9788-3cfb9827d312" containerName="registry-server" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.158027 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d2740c-c4da-4ae1-9788-3cfb9827d312" containerName="registry-server" Mar 07 07:16:51 crc kubenswrapper[4738]: E0307 07:16:51.158139 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52d2740c-c4da-4ae1-9788-3cfb9827d312" containerName="extract-utilities" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.158313 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="52d2740c-c4da-4ae1-9788-3cfb9827d312" containerName="extract-utilities" Mar 07 07:16:51 crc kubenswrapper[4738]: E0307 07:16:51.158430 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be13512b-82bf-443f-b0a5-0913e77b0099" containerName="extract-content" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.158527 4738 
state_mem.go:107] "Deleted CPUSet assignment" podUID="be13512b-82bf-443f-b0a5-0913e77b0099" containerName="extract-content" Mar 07 07:16:51 crc kubenswrapper[4738]: E0307 07:16:51.158635 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be13512b-82bf-443f-b0a5-0913e77b0099" containerName="extract-utilities" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.158729 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="be13512b-82bf-443f-b0a5-0913e77b0099" containerName="extract-utilities" Mar 07 07:16:51 crc kubenswrapper[4738]: E0307 07:16:51.158845 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a174fb-941f-49cf-b45c-f0911b621231" containerName="extract-utilities" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.158959 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a174fb-941f-49cf-b45c-f0911b621231" containerName="extract-utilities" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.159280 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="be13512b-82bf-443f-b0a5-0913e77b0099" containerName="registry-server" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.159420 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="52d2740c-c4da-4ae1-9788-3cfb9827d312" containerName="registry-server" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.159534 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a174fb-941f-49cf-b45c-f0911b621231" containerName="registry-server" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.161071 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.163068 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-fvfwj" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.168583 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm"] Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.267192 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-bundle\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm\" (UID: \"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.267367 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jw6c5\" (UniqueName: \"kubernetes.io/projected/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-kube-api-access-jw6c5\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm\" (UID: \"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.267725 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-util\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm\" (UID: \"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 
07:16:51.369613 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-bundle\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm\" (UID: \"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.369731 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw6c5\" (UniqueName: \"kubernetes.io/projected/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-kube-api-access-jw6c5\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm\" (UID: \"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.369791 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-util\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm\" (UID: \"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.370076 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-bundle\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm\" (UID: \"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.370109 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-util\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm\" (UID: \"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.389453 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw6c5\" (UniqueName: \"kubernetes.io/projected/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-kube-api-access-jw6c5\") pod \"a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm\" (UID: \"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef\") " pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" Mar 07 07:16:51 crc kubenswrapper[4738]: I0307 07:16:51.510626 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" Mar 07 07:16:52 crc kubenswrapper[4738]: I0307 07:16:52.002082 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm"] Mar 07 07:16:52 crc kubenswrapper[4738]: W0307 07:16:52.013198 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51f3d627_e5bd_4ffe_8b54_79c4fd8a86ef.slice/crio-174cfee08e9b24bbba6906a11d523617e18def4f1ecd70fe86dcaa8d4dafcbee WatchSource:0}: Error finding container 174cfee08e9b24bbba6906a11d523617e18def4f1ecd70fe86dcaa8d4dafcbee: Status 404 returned error can't find the container with id 174cfee08e9b24bbba6906a11d523617e18def4f1ecd70fe86dcaa8d4dafcbee Mar 07 07:16:52 crc kubenswrapper[4738]: I0307 07:16:52.044231 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" 
event={"ID":"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef","Type":"ContainerStarted","Data":"174cfee08e9b24bbba6906a11d523617e18def4f1ecd70fe86dcaa8d4dafcbee"} Mar 07 07:16:53 crc kubenswrapper[4738]: I0307 07:16:53.054478 4738 generic.go:334] "Generic (PLEG): container finished" podID="51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef" containerID="6e989a41a5cfc306daedaebca17dcec1ea98d9651c97e8b530e576d61bc6edc6" exitCode=0 Mar 07 07:16:53 crc kubenswrapper[4738]: I0307 07:16:53.054550 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" event={"ID":"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef","Type":"ContainerDied","Data":"6e989a41a5cfc306daedaebca17dcec1ea98d9651c97e8b530e576d61bc6edc6"} Mar 07 07:16:54 crc kubenswrapper[4738]: I0307 07:16:54.065390 4738 generic.go:334] "Generic (PLEG): container finished" podID="51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef" containerID="da03914566652745a1800a0f591b481fef2449145d86d8cb86352c634f11acc9" exitCode=0 Mar 07 07:16:54 crc kubenswrapper[4738]: I0307 07:16:54.065458 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" event={"ID":"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef","Type":"ContainerDied","Data":"da03914566652745a1800a0f591b481fef2449145d86d8cb86352c634f11acc9"} Mar 07 07:16:55 crc kubenswrapper[4738]: I0307 07:16:55.084841 4738 generic.go:334] "Generic (PLEG): container finished" podID="51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef" containerID="24a6e6ae9ad4df14956102d2aeef054dcef0640837fa9ad1fc39cbc9c14cba25" exitCode=0 Mar 07 07:16:55 crc kubenswrapper[4738]: I0307 07:16:55.084947 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" event={"ID":"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef","Type":"ContainerDied","Data":"24a6e6ae9ad4df14956102d2aeef054dcef0640837fa9ad1fc39cbc9c14cba25"} Mar 07 
07:16:56 crc kubenswrapper[4738]: I0307 07:16:56.427088 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" Mar 07 07:16:56 crc kubenswrapper[4738]: I0307 07:16:56.544974 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jw6c5\" (UniqueName: \"kubernetes.io/projected/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-kube-api-access-jw6c5\") pod \"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef\" (UID: \"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef\") " Mar 07 07:16:56 crc kubenswrapper[4738]: I0307 07:16:56.545218 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-bundle\") pod \"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef\" (UID: \"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef\") " Mar 07 07:16:56 crc kubenswrapper[4738]: I0307 07:16:56.545273 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-util\") pod \"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef\" (UID: \"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef\") " Mar 07 07:16:56 crc kubenswrapper[4738]: I0307 07:16:56.549614 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-bundle" (OuterVolumeSpecName: "bundle") pod "51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef" (UID: "51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:16:56 crc kubenswrapper[4738]: I0307 07:16:56.554295 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-kube-api-access-jw6c5" (OuterVolumeSpecName: "kube-api-access-jw6c5") pod "51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef" (UID: "51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef"). InnerVolumeSpecName "kube-api-access-jw6c5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:16:56 crc kubenswrapper[4738]: I0307 07:16:56.577728 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-util" (OuterVolumeSpecName: "util") pod "51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef" (UID: "51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:16:56 crc kubenswrapper[4738]: I0307 07:16:56.647531 4738 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:56 crc kubenswrapper[4738]: I0307 07:16:56.647587 4738 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-util\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:56 crc kubenswrapper[4738]: I0307 07:16:56.647607 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jw6c5\" (UniqueName: \"kubernetes.io/projected/51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef-kube-api-access-jw6c5\") on node \"crc\" DevicePath \"\"" Mar 07 07:16:57 crc kubenswrapper[4738]: I0307 07:16:57.100501 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" 
event={"ID":"51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef","Type":"ContainerDied","Data":"174cfee08e9b24bbba6906a11d523617e18def4f1ecd70fe86dcaa8d4dafcbee"} Mar 07 07:16:57 crc kubenswrapper[4738]: I0307 07:16:57.100948 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="174cfee08e9b24bbba6906a11d523617e18def4f1ecd70fe86dcaa8d4dafcbee" Mar 07 07:16:57 crc kubenswrapper[4738]: I0307 07:16:57.100669 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.113426 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw"] Mar 07 07:17:07 crc kubenswrapper[4738]: E0307 07:17:07.114111 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef" containerName="pull" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.114125 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef" containerName="pull" Mar 07 07:17:07 crc kubenswrapper[4738]: E0307 07:17:07.114136 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef" containerName="extract" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.114143 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef" containerName="extract" Mar 07 07:17:07 crc kubenswrapper[4738]: E0307 07:17:07.114182 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef" containerName="util" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.114188 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef" containerName="util" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.114295 4738 
memory_manager.go:354] "RemoveStaleState removing state" podUID="51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef" containerName="extract" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.114849 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.119410 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.123964 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-zn6wb" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.142101 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw"] Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.203254 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6pn8\" (UniqueName: \"kubernetes.io/projected/02e16c67-848e-4269-bb82-09ace460b9fe-kube-api-access-g6pn8\") pod \"keystone-operator-controller-manager-67bb98769b-bs7fw\" (UID: \"02e16c67-848e-4269-bb82-09ace460b9fe\") " pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.203339 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02e16c67-848e-4269-bb82-09ace460b9fe-webhook-cert\") pod \"keystone-operator-controller-manager-67bb98769b-bs7fw\" (UID: \"02e16c67-848e-4269-bb82-09ace460b9fe\") " pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.203386 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02e16c67-848e-4269-bb82-09ace460b9fe-apiservice-cert\") pod \"keystone-operator-controller-manager-67bb98769b-bs7fw\" (UID: \"02e16c67-848e-4269-bb82-09ace460b9fe\") " pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.305112 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6pn8\" (UniqueName: \"kubernetes.io/projected/02e16c67-848e-4269-bb82-09ace460b9fe-kube-api-access-g6pn8\") pod \"keystone-operator-controller-manager-67bb98769b-bs7fw\" (UID: \"02e16c67-848e-4269-bb82-09ace460b9fe\") " pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.305278 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02e16c67-848e-4269-bb82-09ace460b9fe-webhook-cert\") pod \"keystone-operator-controller-manager-67bb98769b-bs7fw\" (UID: \"02e16c67-848e-4269-bb82-09ace460b9fe\") " pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.305335 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02e16c67-848e-4269-bb82-09ace460b9fe-apiservice-cert\") pod \"keystone-operator-controller-manager-67bb98769b-bs7fw\" (UID: \"02e16c67-848e-4269-bb82-09ace460b9fe\") " pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.312115 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/02e16c67-848e-4269-bb82-09ace460b9fe-apiservice-cert\") pod 
\"keystone-operator-controller-manager-67bb98769b-bs7fw\" (UID: \"02e16c67-848e-4269-bb82-09ace460b9fe\") " pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.312222 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/02e16c67-848e-4269-bb82-09ace460b9fe-webhook-cert\") pod \"keystone-operator-controller-manager-67bb98769b-bs7fw\" (UID: \"02e16c67-848e-4269-bb82-09ace460b9fe\") " pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.327955 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6pn8\" (UniqueName: \"kubernetes.io/projected/02e16c67-848e-4269-bb82-09ace460b9fe-kube-api-access-g6pn8\") pod \"keystone-operator-controller-manager-67bb98769b-bs7fw\" (UID: \"02e16c67-848e-4269-bb82-09ace460b9fe\") " pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.443739 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.903022 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-r8qnh"] Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.904782 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.924409 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8qnh"] Mar 07 07:17:07 crc kubenswrapper[4738]: I0307 07:17:07.948256 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw"] Mar 07 07:17:07 crc kubenswrapper[4738]: W0307 07:17:07.980464 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02e16c67_848e_4269_bb82_09ace460b9fe.slice/crio-0dbbf5089a4ebf936db180d9335d54bccdbfca27703a7bf7b5324e134d4df04f WatchSource:0}: Error finding container 0dbbf5089a4ebf936db180d9335d54bccdbfca27703a7bf7b5324e134d4df04f: Status 404 returned error can't find the container with id 0dbbf5089a4ebf936db180d9335d54bccdbfca27703a7bf7b5324e134d4df04f Mar 07 07:17:08 crc kubenswrapper[4738]: I0307 07:17:08.016108 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bfa29ec-3e0b-41c1-9841-a4a82c609169-utilities\") pod \"community-operators-r8qnh\" (UID: \"8bfa29ec-3e0b-41c1-9841-a4a82c609169\") " pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:08 crc kubenswrapper[4738]: I0307 07:17:08.016183 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vblxh\" (UniqueName: \"kubernetes.io/projected/8bfa29ec-3e0b-41c1-9841-a4a82c609169-kube-api-access-vblxh\") pod \"community-operators-r8qnh\" (UID: \"8bfa29ec-3e0b-41c1-9841-a4a82c609169\") " pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:08 crc kubenswrapper[4738]: I0307 07:17:08.016442 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bfa29ec-3e0b-41c1-9841-a4a82c609169-catalog-content\") pod \"community-operators-r8qnh\" (UID: \"8bfa29ec-3e0b-41c1-9841-a4a82c609169\") " pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:08 crc kubenswrapper[4738]: I0307 07:17:08.117466 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bfa29ec-3e0b-41c1-9841-a4a82c609169-utilities\") pod \"community-operators-r8qnh\" (UID: \"8bfa29ec-3e0b-41c1-9841-a4a82c609169\") " pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:08 crc kubenswrapper[4738]: I0307 07:17:08.118368 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vblxh\" (UniqueName: \"kubernetes.io/projected/8bfa29ec-3e0b-41c1-9841-a4a82c609169-kube-api-access-vblxh\") pod \"community-operators-r8qnh\" (UID: \"8bfa29ec-3e0b-41c1-9841-a4a82c609169\") " pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:08 crc kubenswrapper[4738]: I0307 07:17:08.118067 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bfa29ec-3e0b-41c1-9841-a4a82c609169-utilities\") pod \"community-operators-r8qnh\" (UID: \"8bfa29ec-3e0b-41c1-9841-a4a82c609169\") " pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:08 crc kubenswrapper[4738]: I0307 07:17:08.118505 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bfa29ec-3e0b-41c1-9841-a4a82c609169-catalog-content\") pod \"community-operators-r8qnh\" (UID: \"8bfa29ec-3e0b-41c1-9841-a4a82c609169\") " pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:08 crc kubenswrapper[4738]: I0307 07:17:08.119067 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8bfa29ec-3e0b-41c1-9841-a4a82c609169-catalog-content\") pod \"community-operators-r8qnh\" (UID: \"8bfa29ec-3e0b-41c1-9841-a4a82c609169\") " pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:08 crc kubenswrapper[4738]: I0307 07:17:08.143915 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vblxh\" (UniqueName: \"kubernetes.io/projected/8bfa29ec-3e0b-41c1-9841-a4a82c609169-kube-api-access-vblxh\") pod \"community-operators-r8qnh\" (UID: \"8bfa29ec-3e0b-41c1-9841-a4a82c609169\") " pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:08 crc kubenswrapper[4738]: I0307 07:17:08.182904 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" event={"ID":"02e16c67-848e-4269-bb82-09ace460b9fe","Type":"ContainerStarted","Data":"0dbbf5089a4ebf936db180d9335d54bccdbfca27703a7bf7b5324e134d4df04f"} Mar 07 07:17:08 crc kubenswrapper[4738]: I0307 07:17:08.244084 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:08 crc kubenswrapper[4738]: I0307 07:17:08.740973 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-r8qnh"] Mar 07 07:17:09 crc kubenswrapper[4738]: I0307 07:17:09.192878 4738 generic.go:334] "Generic (PLEG): container finished" podID="3f61d3e1-a31b-4e55-8826-0613669b010c" containerID="96fb6a3051ecf8b552e0ae49820ceac6054d137eb99d0cc6624168364e9c4aff" exitCode=0 Mar 07 07:17:09 crc kubenswrapper[4738]: I0307 07:17:09.192969 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" event={"ID":"3f61d3e1-a31b-4e55-8826-0613669b010c","Type":"ContainerDied","Data":"96fb6a3051ecf8b552e0ae49820ceac6054d137eb99d0cc6624168364e9c4aff"} Mar 07 07:17:09 crc kubenswrapper[4738]: I0307 07:17:09.197664 4738 generic.go:334] "Generic (PLEG): container finished" podID="8bfa29ec-3e0b-41c1-9841-a4a82c609169" containerID="f0f0525f54e46de6bfd55a22853747f5de0ac61b273859e23cf14de3b3d5f6df" exitCode=0 Mar 07 07:17:09 crc kubenswrapper[4738]: I0307 07:17:09.197712 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8qnh" event={"ID":"8bfa29ec-3e0b-41c1-9841-a4a82c609169","Type":"ContainerDied","Data":"f0f0525f54e46de6bfd55a22853747f5de0ac61b273859e23cf14de3b3d5f6df"} Mar 07 07:17:09 crc kubenswrapper[4738]: I0307 07:17:09.197743 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8qnh" event={"ID":"8bfa29ec-3e0b-41c1-9841-a4a82c609169","Type":"ContainerStarted","Data":"242733c5a6532c31a2d2533d37d58fe99d8e537620db104c3284b1f76f78e947"} Mar 07 07:17:10 crc kubenswrapper[4738]: I0307 07:17:10.207945 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/rabbitmq-server-0" 
event={"ID":"3f61d3e1-a31b-4e55-8826-0613669b010c","Type":"ContainerStarted","Data":"70518dc3c314a2238bddfb8521103442dfeaf5258b29876fdecacf006b972aee"} Mar 07 07:17:10 crc kubenswrapper[4738]: I0307 07:17:10.208450 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:17:10 crc kubenswrapper[4738]: I0307 07:17:10.249078 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/rabbitmq-server-0" podStartSLOduration=36.137774227 podStartE2EDuration="44.249062333s" podCreationTimestamp="2026-03-07 07:16:26 +0000 UTC" firstStartedPulling="2026-03-07 07:16:28.131834041 +0000 UTC m=+1006.596821362" lastFinishedPulling="2026-03-07 07:16:36.243122127 +0000 UTC m=+1014.708109468" observedRunningTime="2026-03-07 07:17:10.24566818 +0000 UTC m=+1048.710655521" watchObservedRunningTime="2026-03-07 07:17:10.249062333 +0000 UTC m=+1048.714049674" Mar 07 07:17:12 crc kubenswrapper[4738]: I0307 07:17:12.222731 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" event={"ID":"02e16c67-848e-4269-bb82-09ace460b9fe","Type":"ContainerStarted","Data":"e522c72ad595f2d20769e00804f9699c585c1c19a7ef5247c87d92607d7563ca"} Mar 07 07:17:12 crc kubenswrapper[4738]: I0307 07:17:12.223228 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" Mar 07 07:17:12 crc kubenswrapper[4738]: I0307 07:17:12.224868 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8qnh" event={"ID":"8bfa29ec-3e0b-41c1-9841-a4a82c609169","Type":"ContainerStarted","Data":"efb583e7b05a2c512790962677863a3b48808f5eec00658428a00f1add11b53d"} Mar 07 07:17:12 crc kubenswrapper[4738]: I0307 07:17:12.253960 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" podStartSLOduration=1.353818226 podStartE2EDuration="5.253941576s" podCreationTimestamp="2026-03-07 07:17:07 +0000 UTC" firstStartedPulling="2026-03-07 07:17:07.983841434 +0000 UTC m=+1046.448828755" lastFinishedPulling="2026-03-07 07:17:11.883964784 +0000 UTC m=+1050.348952105" observedRunningTime="2026-03-07 07:17:12.249697051 +0000 UTC m=+1050.714684372" watchObservedRunningTime="2026-03-07 07:17:12.253941576 +0000 UTC m=+1050.718928897" Mar 07 07:17:13 crc kubenswrapper[4738]: I0307 07:17:13.235107 4738 generic.go:334] "Generic (PLEG): container finished" podID="8bfa29ec-3e0b-41c1-9841-a4a82c609169" containerID="efb583e7b05a2c512790962677863a3b48808f5eec00658428a00f1add11b53d" exitCode=0 Mar 07 07:17:13 crc kubenswrapper[4738]: I0307 07:17:13.236181 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8qnh" event={"ID":"8bfa29ec-3e0b-41c1-9841-a4a82c609169","Type":"ContainerDied","Data":"efb583e7b05a2c512790962677863a3b48808f5eec00658428a00f1add11b53d"} Mar 07 07:17:14 crc kubenswrapper[4738]: I0307 07:17:14.245270 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8qnh" event={"ID":"8bfa29ec-3e0b-41c1-9841-a4a82c609169","Type":"ContainerStarted","Data":"dadaabd10a190b6b264f52e56ea91b7b5a6dc031e84ceb21fb58626ec088c4ce"} Mar 07 07:17:17 crc kubenswrapper[4738]: I0307 07:17:17.450004 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-67bb98769b-bs7fw" Mar 07 07:17:17 crc kubenswrapper[4738]: I0307 07:17:17.475390 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-r8qnh" podStartSLOduration=5.959065467 podStartE2EDuration="10.475364271s" podCreationTimestamp="2026-03-07 07:17:07 +0000 UTC" firstStartedPulling="2026-03-07 
07:17:09.200336973 +0000 UTC m=+1047.665324294" lastFinishedPulling="2026-03-07 07:17:13.716635777 +0000 UTC m=+1052.181623098" observedRunningTime="2026-03-07 07:17:14.28875429 +0000 UTC m=+1052.753741601" watchObservedRunningTime="2026-03-07 07:17:17.475364271 +0000 UTC m=+1055.940351632" Mar 07 07:17:18 crc kubenswrapper[4738]: I0307 07:17:18.244514 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:18 crc kubenswrapper[4738]: I0307 07:17:18.244593 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:18 crc kubenswrapper[4738]: I0307 07:17:18.321897 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:18 crc kubenswrapper[4738]: I0307 07:17:18.395200 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:20 crc kubenswrapper[4738]: I0307 07:17:20.489067 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r8qnh"] Mar 07 07:17:20 crc kubenswrapper[4738]: I0307 07:17:20.489637 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-r8qnh" podUID="8bfa29ec-3e0b-41c1-9841-a4a82c609169" containerName="registry-server" containerID="cri-o://dadaabd10a190b6b264f52e56ea91b7b5a6dc031e84ceb21fb58626ec088c4ce" gracePeriod=2 Mar 07 07:17:20 crc kubenswrapper[4738]: I0307 07:17:20.935846 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.034855 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vblxh\" (UniqueName: \"kubernetes.io/projected/8bfa29ec-3e0b-41c1-9841-a4a82c609169-kube-api-access-vblxh\") pod \"8bfa29ec-3e0b-41c1-9841-a4a82c609169\" (UID: \"8bfa29ec-3e0b-41c1-9841-a4a82c609169\") " Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.035036 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bfa29ec-3e0b-41c1-9841-a4a82c609169-utilities\") pod \"8bfa29ec-3e0b-41c1-9841-a4a82c609169\" (UID: \"8bfa29ec-3e0b-41c1-9841-a4a82c609169\") " Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.035118 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bfa29ec-3e0b-41c1-9841-a4a82c609169-catalog-content\") pod \"8bfa29ec-3e0b-41c1-9841-a4a82c609169\" (UID: \"8bfa29ec-3e0b-41c1-9841-a4a82c609169\") " Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.036246 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bfa29ec-3e0b-41c1-9841-a4a82c609169-utilities" (OuterVolumeSpecName: "utilities") pod "8bfa29ec-3e0b-41c1-9841-a4a82c609169" (UID: "8bfa29ec-3e0b-41c1-9841-a4a82c609169"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.043068 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8bfa29ec-3e0b-41c1-9841-a4a82c609169-kube-api-access-vblxh" (OuterVolumeSpecName: "kube-api-access-vblxh") pod "8bfa29ec-3e0b-41c1-9841-a4a82c609169" (UID: "8bfa29ec-3e0b-41c1-9841-a4a82c609169"). InnerVolumeSpecName "kube-api-access-vblxh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.121062 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8bfa29ec-3e0b-41c1-9841-a4a82c609169-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8bfa29ec-3e0b-41c1-9841-a4a82c609169" (UID: "8bfa29ec-3e0b-41c1-9841-a4a82c609169"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.137440 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vblxh\" (UniqueName: \"kubernetes.io/projected/8bfa29ec-3e0b-41c1-9841-a4a82c609169-kube-api-access-vblxh\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.137494 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8bfa29ec-3e0b-41c1-9841-a4a82c609169-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.137507 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8bfa29ec-3e0b-41c1-9841-a4a82c609169-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.309478 4738 generic.go:334] "Generic (PLEG): container finished" podID="8bfa29ec-3e0b-41c1-9841-a4a82c609169" containerID="dadaabd10a190b6b264f52e56ea91b7b5a6dc031e84ceb21fb58626ec088c4ce" exitCode=0 Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.309580 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-r8qnh" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.309573 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8qnh" event={"ID":"8bfa29ec-3e0b-41c1-9841-a4a82c609169","Type":"ContainerDied","Data":"dadaabd10a190b6b264f52e56ea91b7b5a6dc031e84ceb21fb58626ec088c4ce"} Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.309881 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-r8qnh" event={"ID":"8bfa29ec-3e0b-41c1-9841-a4a82c609169","Type":"ContainerDied","Data":"242733c5a6532c31a2d2533d37d58fe99d8e537620db104c3284b1f76f78e947"} Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.310042 4738 scope.go:117] "RemoveContainer" containerID="dadaabd10a190b6b264f52e56ea91b7b5a6dc031e84ceb21fb58626ec088c4ce" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.341109 4738 scope.go:117] "RemoveContainer" containerID="efb583e7b05a2c512790962677863a3b48808f5eec00658428a00f1add11b53d" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.355305 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-r8qnh"] Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.364838 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-r8qnh"] Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.386180 4738 scope.go:117] "RemoveContainer" containerID="f0f0525f54e46de6bfd55a22853747f5de0ac61b273859e23cf14de3b3d5f6df" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.413526 4738 scope.go:117] "RemoveContainer" containerID="dadaabd10a190b6b264f52e56ea91b7b5a6dc031e84ceb21fb58626ec088c4ce" Mar 07 07:17:21 crc kubenswrapper[4738]: E0307 07:17:21.414427 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"dadaabd10a190b6b264f52e56ea91b7b5a6dc031e84ceb21fb58626ec088c4ce\": container with ID starting with dadaabd10a190b6b264f52e56ea91b7b5a6dc031e84ceb21fb58626ec088c4ce not found: ID does not exist" containerID="dadaabd10a190b6b264f52e56ea91b7b5a6dc031e84ceb21fb58626ec088c4ce" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.414482 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dadaabd10a190b6b264f52e56ea91b7b5a6dc031e84ceb21fb58626ec088c4ce"} err="failed to get container status \"dadaabd10a190b6b264f52e56ea91b7b5a6dc031e84ceb21fb58626ec088c4ce\": rpc error: code = NotFound desc = could not find container \"dadaabd10a190b6b264f52e56ea91b7b5a6dc031e84ceb21fb58626ec088c4ce\": container with ID starting with dadaabd10a190b6b264f52e56ea91b7b5a6dc031e84ceb21fb58626ec088c4ce not found: ID does not exist" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.414516 4738 scope.go:117] "RemoveContainer" containerID="efb583e7b05a2c512790962677863a3b48808f5eec00658428a00f1add11b53d" Mar 07 07:17:21 crc kubenswrapper[4738]: E0307 07:17:21.415050 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efb583e7b05a2c512790962677863a3b48808f5eec00658428a00f1add11b53d\": container with ID starting with efb583e7b05a2c512790962677863a3b48808f5eec00658428a00f1add11b53d not found: ID does not exist" containerID="efb583e7b05a2c512790962677863a3b48808f5eec00658428a00f1add11b53d" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.415123 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb583e7b05a2c512790962677863a3b48808f5eec00658428a00f1add11b53d"} err="failed to get container status \"efb583e7b05a2c512790962677863a3b48808f5eec00658428a00f1add11b53d\": rpc error: code = NotFound desc = could not find container \"efb583e7b05a2c512790962677863a3b48808f5eec00658428a00f1add11b53d\": container with ID 
starting with efb583e7b05a2c512790962677863a3b48808f5eec00658428a00f1add11b53d not found: ID does not exist" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.415194 4738 scope.go:117] "RemoveContainer" containerID="f0f0525f54e46de6bfd55a22853747f5de0ac61b273859e23cf14de3b3d5f6df" Mar 07 07:17:21 crc kubenswrapper[4738]: E0307 07:17:21.416514 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0f0525f54e46de6bfd55a22853747f5de0ac61b273859e23cf14de3b3d5f6df\": container with ID starting with f0f0525f54e46de6bfd55a22853747f5de0ac61b273859e23cf14de3b3d5f6df not found: ID does not exist" containerID="f0f0525f54e46de6bfd55a22853747f5de0ac61b273859e23cf14de3b3d5f6df" Mar 07 07:17:21 crc kubenswrapper[4738]: I0307 07:17:21.416543 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0f0525f54e46de6bfd55a22853747f5de0ac61b273859e23cf14de3b3d5f6df"} err="failed to get container status \"f0f0525f54e46de6bfd55a22853747f5de0ac61b273859e23cf14de3b3d5f6df\": rpc error: code = NotFound desc = could not find container \"f0f0525f54e46de6bfd55a22853747f5de0ac61b273859e23cf14de3b3d5f6df\": container with ID starting with f0f0525f54e46de6bfd55a22853747f5de0ac61b273859e23cf14de3b3d5f6df not found: ID does not exist" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.398271 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8bfa29ec-3e0b-41c1-9841-a4a82c609169" path="/var/lib/kubelet/pods/8bfa29ec-3e0b-41c1-9841-a4a82c609169/volumes" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.445198 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-db-create-rv7jl"] Mar 07 07:17:22 crc kubenswrapper[4738]: E0307 07:17:22.445560 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bfa29ec-3e0b-41c1-9841-a4a82c609169" containerName="extract-content" Mar 07 07:17:22 crc 
kubenswrapper[4738]: I0307 07:17:22.445587 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bfa29ec-3e0b-41c1-9841-a4a82c609169" containerName="extract-content" Mar 07 07:17:22 crc kubenswrapper[4738]: E0307 07:17:22.445614 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bfa29ec-3e0b-41c1-9841-a4a82c609169" containerName="registry-server" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.445625 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bfa29ec-3e0b-41c1-9841-a4a82c609169" containerName="registry-server" Mar 07 07:17:22 crc kubenswrapper[4738]: E0307 07:17:22.445660 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8bfa29ec-3e0b-41c1-9841-a4a82c609169" containerName="extract-utilities" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.445672 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="8bfa29ec-3e0b-41c1-9841-a4a82c609169" containerName="extract-utilities" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.445899 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="8bfa29ec-3e0b-41c1-9841-a4a82c609169" containerName="registry-server" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.446671 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-rv7jl" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.464589 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc"] Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.465671 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.473471 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-db-secret" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.483408 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-create-rv7jl"] Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.560465 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56eba8ea-c92b-40b6-84bf-b37e6327176f-operator-scripts\") pod \"keystone-db-create-rv7jl\" (UID: \"56eba8ea-c92b-40b6-84bf-b37e6327176f\") " pod="swift-kuttl-tests/keystone-db-create-rv7jl" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.560532 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnwjk\" (UniqueName: \"kubernetes.io/projected/56eba8ea-c92b-40b6-84bf-b37e6327176f-kube-api-access-tnwjk\") pod \"keystone-db-create-rv7jl\" (UID: \"56eba8ea-c92b-40b6-84bf-b37e6327176f\") " pod="swift-kuttl-tests/keystone-db-create-rv7jl" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.647035 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc"] Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.661958 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56eba8ea-c92b-40b6-84bf-b37e6327176f-operator-scripts\") pod \"keystone-db-create-rv7jl\" (UID: \"56eba8ea-c92b-40b6-84bf-b37e6327176f\") " pod="swift-kuttl-tests/keystone-db-create-rv7jl" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.662004 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tnwjk\" (UniqueName: \"kubernetes.io/projected/56eba8ea-c92b-40b6-84bf-b37e6327176f-kube-api-access-tnwjk\") pod \"keystone-db-create-rv7jl\" (UID: \"56eba8ea-c92b-40b6-84bf-b37e6327176f\") " pod="swift-kuttl-tests/keystone-db-create-rv7jl" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.662052 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b35e853f-e4d7-4cea-be6c-e2405e776b20-operator-scripts\") pod \"keystone-c2f4-account-create-update-96rbc\" (UID: \"b35e853f-e4d7-4cea-be6c-e2405e776b20\") " pod="swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.662092 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmjm5\" (UniqueName: \"kubernetes.io/projected/b35e853f-e4d7-4cea-be6c-e2405e776b20-kube-api-access-lmjm5\") pod \"keystone-c2f4-account-create-update-96rbc\" (UID: \"b35e853f-e4d7-4cea-be6c-e2405e776b20\") " pod="swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.662747 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56eba8ea-c92b-40b6-84bf-b37e6327176f-operator-scripts\") pod \"keystone-db-create-rv7jl\" (UID: \"56eba8ea-c92b-40b6-84bf-b37e6327176f\") " pod="swift-kuttl-tests/keystone-db-create-rv7jl" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.683026 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnwjk\" (UniqueName: \"kubernetes.io/projected/56eba8ea-c92b-40b6-84bf-b37e6327176f-kube-api-access-tnwjk\") pod \"keystone-db-create-rv7jl\" (UID: \"56eba8ea-c92b-40b6-84bf-b37e6327176f\") " pod="swift-kuttl-tests/keystone-db-create-rv7jl" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 
07:17:22.763112 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmjm5\" (UniqueName: \"kubernetes.io/projected/b35e853f-e4d7-4cea-be6c-e2405e776b20-kube-api-access-lmjm5\") pod \"keystone-c2f4-account-create-update-96rbc\" (UID: \"b35e853f-e4d7-4cea-be6c-e2405e776b20\") " pod="swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.763262 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b35e853f-e4d7-4cea-be6c-e2405e776b20-operator-scripts\") pod \"keystone-c2f4-account-create-update-96rbc\" (UID: \"b35e853f-e4d7-4cea-be6c-e2405e776b20\") " pod="swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.763861 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b35e853f-e4d7-4cea-be6c-e2405e776b20-operator-scripts\") pod \"keystone-c2f4-account-create-update-96rbc\" (UID: \"b35e853f-e4d7-4cea-be6c-e2405e776b20\") " pod="swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.796526 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmjm5\" (UniqueName: \"kubernetes.io/projected/b35e853f-e4d7-4cea-be6c-e2405e776b20-kube-api-access-lmjm5\") pod \"keystone-c2f4-account-create-update-96rbc\" (UID: \"b35e853f-e4d7-4cea-be6c-e2405e776b20\") " pod="swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.834402 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-rv7jl" Mar 07 07:17:22 crc kubenswrapper[4738]: I0307 07:17:22.847903 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc" Mar 07 07:17:23 crc kubenswrapper[4738]: I0307 07:17:23.112523 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-create-rv7jl"] Mar 07 07:17:23 crc kubenswrapper[4738]: I0307 07:17:23.165193 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc"] Mar 07 07:17:23 crc kubenswrapper[4738]: I0307 07:17:23.327518 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc" event={"ID":"b35e853f-e4d7-4cea-be6c-e2405e776b20","Type":"ContainerStarted","Data":"7693ad0835e1960d2671c5be0e94b2c12b5852315d22f365a14b62b8b764fa9c"} Mar 07 07:17:23 crc kubenswrapper[4738]: I0307 07:17:23.327567 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc" event={"ID":"b35e853f-e4d7-4cea-be6c-e2405e776b20","Type":"ContainerStarted","Data":"7905b4dfd42c9ba4b29d5687ad76d447b0698beb568bf5d71aa42919572ca4cb"} Mar 07 07:17:23 crc kubenswrapper[4738]: I0307 07:17:23.329117 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-create-rv7jl" event={"ID":"56eba8ea-c92b-40b6-84bf-b37e6327176f","Type":"ContainerStarted","Data":"3aa24e37d0d3312d3218ee6e4cef6c0b1eb41c0b0f78abcc469c1f6012c723ea"} Mar 07 07:17:23 crc kubenswrapper[4738]: I0307 07:17:23.329188 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-create-rv7jl" event={"ID":"56eba8ea-c92b-40b6-84bf-b37e6327176f","Type":"ContainerStarted","Data":"0ce2999a72199399650bcbfea11639487e25624a2043889a06de81847359af09"} Mar 07 07:17:23 crc kubenswrapper[4738]: I0307 07:17:23.350318 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc" podStartSLOduration=1.35028864 
podStartE2EDuration="1.35028864s" podCreationTimestamp="2026-03-07 07:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:17:23.344086753 +0000 UTC m=+1061.809074074" watchObservedRunningTime="2026-03-07 07:17:23.35028864 +0000 UTC m=+1061.815275961" Mar 07 07:17:23 crc kubenswrapper[4738]: I0307 07:17:23.379802 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-db-create-rv7jl" podStartSLOduration=1.379773219 podStartE2EDuration="1.379773219s" podCreationTimestamp="2026-03-07 07:17:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:17:23.363824517 +0000 UTC m=+1061.828811838" watchObservedRunningTime="2026-03-07 07:17:23.379773219 +0000 UTC m=+1061.844760550" Mar 07 07:17:24 crc kubenswrapper[4738]: I0307 07:17:24.339114 4738 generic.go:334] "Generic (PLEG): container finished" podID="56eba8ea-c92b-40b6-84bf-b37e6327176f" containerID="3aa24e37d0d3312d3218ee6e4cef6c0b1eb41c0b0f78abcc469c1f6012c723ea" exitCode=0 Mar 07 07:17:24 crc kubenswrapper[4738]: I0307 07:17:24.339256 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-create-rv7jl" event={"ID":"56eba8ea-c92b-40b6-84bf-b37e6327176f","Type":"ContainerDied","Data":"3aa24e37d0d3312d3218ee6e4cef6c0b1eb41c0b0f78abcc469c1f6012c723ea"} Mar 07 07:17:24 crc kubenswrapper[4738]: I0307 07:17:24.343861 4738 generic.go:334] "Generic (PLEG): container finished" podID="b35e853f-e4d7-4cea-be6c-e2405e776b20" containerID="7693ad0835e1960d2671c5be0e94b2c12b5852315d22f365a14b62b8b764fa9c" exitCode=0 Mar 07 07:17:24 crc kubenswrapper[4738]: I0307 07:17:24.343938 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc" 
event={"ID":"b35e853f-e4d7-4cea-be6c-e2405e776b20","Type":"ContainerDied","Data":"7693ad0835e1960d2671c5be0e94b2c12b5852315d22f365a14b62b8b764fa9c"} Mar 07 07:17:25 crc kubenswrapper[4738]: I0307 07:17:25.681525 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-rv7jl" Mar 07 07:17:25 crc kubenswrapper[4738]: I0307 07:17:25.687031 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc" Mar 07 07:17:25 crc kubenswrapper[4738]: I0307 07:17:25.821468 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmjm5\" (UniqueName: \"kubernetes.io/projected/b35e853f-e4d7-4cea-be6c-e2405e776b20-kube-api-access-lmjm5\") pod \"b35e853f-e4d7-4cea-be6c-e2405e776b20\" (UID: \"b35e853f-e4d7-4cea-be6c-e2405e776b20\") " Mar 07 07:17:25 crc kubenswrapper[4738]: I0307 07:17:25.821556 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b35e853f-e4d7-4cea-be6c-e2405e776b20-operator-scripts\") pod \"b35e853f-e4d7-4cea-be6c-e2405e776b20\" (UID: \"b35e853f-e4d7-4cea-be6c-e2405e776b20\") " Mar 07 07:17:25 crc kubenswrapper[4738]: I0307 07:17:25.821646 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnwjk\" (UniqueName: \"kubernetes.io/projected/56eba8ea-c92b-40b6-84bf-b37e6327176f-kube-api-access-tnwjk\") pod \"56eba8ea-c92b-40b6-84bf-b37e6327176f\" (UID: \"56eba8ea-c92b-40b6-84bf-b37e6327176f\") " Mar 07 07:17:25 crc kubenswrapper[4738]: I0307 07:17:25.821737 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56eba8ea-c92b-40b6-84bf-b37e6327176f-operator-scripts\") pod \"56eba8ea-c92b-40b6-84bf-b37e6327176f\" (UID: 
\"56eba8ea-c92b-40b6-84bf-b37e6327176f\") " Mar 07 07:17:25 crc kubenswrapper[4738]: I0307 07:17:25.822684 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b35e853f-e4d7-4cea-be6c-e2405e776b20-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b35e853f-e4d7-4cea-be6c-e2405e776b20" (UID: "b35e853f-e4d7-4cea-be6c-e2405e776b20"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:25 crc kubenswrapper[4738]: I0307 07:17:25.822769 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56eba8ea-c92b-40b6-84bf-b37e6327176f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56eba8ea-c92b-40b6-84bf-b37e6327176f" (UID: "56eba8ea-c92b-40b6-84bf-b37e6327176f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:17:25 crc kubenswrapper[4738]: I0307 07:17:25.829041 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b35e853f-e4d7-4cea-be6c-e2405e776b20-kube-api-access-lmjm5" (OuterVolumeSpecName: "kube-api-access-lmjm5") pod "b35e853f-e4d7-4cea-be6c-e2405e776b20" (UID: "b35e853f-e4d7-4cea-be6c-e2405e776b20"). InnerVolumeSpecName "kube-api-access-lmjm5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:25 crc kubenswrapper[4738]: I0307 07:17:25.829447 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56eba8ea-c92b-40b6-84bf-b37e6327176f-kube-api-access-tnwjk" (OuterVolumeSpecName: "kube-api-access-tnwjk") pod "56eba8ea-c92b-40b6-84bf-b37e6327176f" (UID: "56eba8ea-c92b-40b6-84bf-b37e6327176f"). InnerVolumeSpecName "kube-api-access-tnwjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:25 crc kubenswrapper[4738]: I0307 07:17:25.923721 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmjm5\" (UniqueName: \"kubernetes.io/projected/b35e853f-e4d7-4cea-be6c-e2405e776b20-kube-api-access-lmjm5\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:25 crc kubenswrapper[4738]: I0307 07:17:25.923776 4738 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b35e853f-e4d7-4cea-be6c-e2405e776b20-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:25 crc kubenswrapper[4738]: I0307 07:17:25.923798 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnwjk\" (UniqueName: \"kubernetes.io/projected/56eba8ea-c92b-40b6-84bf-b37e6327176f-kube-api-access-tnwjk\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:25 crc kubenswrapper[4738]: I0307 07:17:25.923818 4738 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56eba8ea-c92b-40b6-84bf-b37e6327176f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.302331 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-index-76wqn"] Mar 07 07:17:26 crc kubenswrapper[4738]: E0307 07:17:26.302636 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56eba8ea-c92b-40b6-84bf-b37e6327176f" containerName="mariadb-database-create" Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.302652 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="56eba8ea-c92b-40b6-84bf-b37e6327176f" containerName="mariadb-database-create" Mar 07 07:17:26 crc kubenswrapper[4738]: E0307 07:17:26.302670 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b35e853f-e4d7-4cea-be6c-e2405e776b20" containerName="mariadb-account-create-update" Mar 07 07:17:26 crc 
kubenswrapper[4738]: I0307 07:17:26.302678 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b35e853f-e4d7-4cea-be6c-e2405e776b20" containerName="mariadb-account-create-update" Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.302863 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="56eba8ea-c92b-40b6-84bf-b37e6327176f" containerName="mariadb-database-create" Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.302898 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="b35e853f-e4d7-4cea-be6c-e2405e776b20" containerName="mariadb-account-create-update" Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.303503 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-76wqn" Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.308520 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-index-dockercfg-wlgvk" Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.319000 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-76wqn"] Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.360325 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-create-rv7jl" event={"ID":"56eba8ea-c92b-40b6-84bf-b37e6327176f","Type":"ContainerDied","Data":"0ce2999a72199399650bcbfea11639487e25624a2043889a06de81847359af09"} Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.360359 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-db-create-rv7jl" Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.360369 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ce2999a72199399650bcbfea11639487e25624a2043889a06de81847359af09" Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.362974 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc" event={"ID":"b35e853f-e4d7-4cea-be6c-e2405e776b20","Type":"ContainerDied","Data":"7905b4dfd42c9ba4b29d5687ad76d447b0698beb568bf5d71aa42919572ca4cb"} Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.363001 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc" Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.363002 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7905b4dfd42c9ba4b29d5687ad76d447b0698beb568bf5d71aa42919572ca4cb" Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.431433 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrmrj\" (UniqueName: \"kubernetes.io/projected/3b87b2f5-b8bf-451f-907e-c47b1c197381-kube-api-access-hrmrj\") pod \"barbican-operator-index-76wqn\" (UID: \"3b87b2f5-b8bf-451f-907e-c47b1c197381\") " pod="openstack-operators/barbican-operator-index-76wqn" Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.532844 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrmrj\" (UniqueName: \"kubernetes.io/projected/3b87b2f5-b8bf-451f-907e-c47b1c197381-kube-api-access-hrmrj\") pod \"barbican-operator-index-76wqn\" (UID: \"3b87b2f5-b8bf-451f-907e-c47b1c197381\") " pod="openstack-operators/barbican-operator-index-76wqn" Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.553691 4738 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hrmrj\" (UniqueName: \"kubernetes.io/projected/3b87b2f5-b8bf-451f-907e-c47b1c197381-kube-api-access-hrmrj\") pod \"barbican-operator-index-76wqn\" (UID: \"3b87b2f5-b8bf-451f-907e-c47b1c197381\") " pod="openstack-operators/barbican-operator-index-76wqn" Mar 07 07:17:26 crc kubenswrapper[4738]: I0307 07:17:26.623927 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-index-76wqn" Mar 07 07:17:27 crc kubenswrapper[4738]: I0307 07:17:27.107563 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-index-76wqn"] Mar 07 07:17:27 crc kubenswrapper[4738]: W0307 07:17:27.116833 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b87b2f5_b8bf_451f_907e_c47b1c197381.slice/crio-7750f673b2a81bdd9a2c54bd0b1447ad1e039c11fafe81605d0ef96108b80c6c WatchSource:0}: Error finding container 7750f673b2a81bdd9a2c54bd0b1447ad1e039c11fafe81605d0ef96108b80c6c: Status 404 returned error can't find the container with id 7750f673b2a81bdd9a2c54bd0b1447ad1e039c11fafe81605d0ef96108b80c6c Mar 07 07:17:27 crc kubenswrapper[4738]: I0307 07:17:27.370636 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-76wqn" event={"ID":"3b87b2f5-b8bf-451f-907e-c47b1c197381","Type":"ContainerStarted","Data":"7750f673b2a81bdd9a2c54bd0b1447ad1e039c11fafe81605d0ef96108b80c6c"} Mar 07 07:17:27 crc kubenswrapper[4738]: I0307 07:17:27.824512 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/rabbitmq-server-0" Mar 07 07:17:28 crc kubenswrapper[4738]: I0307 07:17:28.283196 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-db-sync-4d4vj"] Mar 07 07:17:28 crc kubenswrapper[4738]: I0307 07:17:28.284187 4738 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-4d4vj" Mar 07 07:17:28 crc kubenswrapper[4738]: I0307 07:17:28.286770 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone" Mar 07 07:17:28 crc kubenswrapper[4738]: I0307 07:17:28.287534 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-keystone-dockercfg-j2zpt" Mar 07 07:17:28 crc kubenswrapper[4738]: I0307 07:17:28.287725 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-config-data" Mar 07 07:17:28 crc kubenswrapper[4738]: I0307 07:17:28.294559 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-scripts" Mar 07 07:17:28 crc kubenswrapper[4738]: I0307 07:17:28.295945 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-sync-4d4vj"] Mar 07 07:17:28 crc kubenswrapper[4738]: I0307 07:17:28.385548 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzvdp\" (UniqueName: \"kubernetes.io/projected/6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee-kube-api-access-pzvdp\") pod \"keystone-db-sync-4d4vj\" (UID: \"6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee\") " pod="swift-kuttl-tests/keystone-db-sync-4d4vj" Mar 07 07:17:28 crc kubenswrapper[4738]: I0307 07:17:28.385719 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee-config-data\") pod \"keystone-db-sync-4d4vj\" (UID: \"6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee\") " pod="swift-kuttl-tests/keystone-db-sync-4d4vj" Mar 07 07:17:28 crc kubenswrapper[4738]: I0307 07:17:28.486656 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzvdp\" (UniqueName: 
\"kubernetes.io/projected/6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee-kube-api-access-pzvdp\") pod \"keystone-db-sync-4d4vj\" (UID: \"6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee\") " pod="swift-kuttl-tests/keystone-db-sync-4d4vj" Mar 07 07:17:28 crc kubenswrapper[4738]: I0307 07:17:28.486755 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee-config-data\") pod \"keystone-db-sync-4d4vj\" (UID: \"6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee\") " pod="swift-kuttl-tests/keystone-db-sync-4d4vj" Mar 07 07:17:28 crc kubenswrapper[4738]: I0307 07:17:28.503766 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee-config-data\") pod \"keystone-db-sync-4d4vj\" (UID: \"6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee\") " pod="swift-kuttl-tests/keystone-db-sync-4d4vj" Mar 07 07:17:28 crc kubenswrapper[4738]: I0307 07:17:28.506036 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzvdp\" (UniqueName: \"kubernetes.io/projected/6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee-kube-api-access-pzvdp\") pod \"keystone-db-sync-4d4vj\" (UID: \"6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee\") " pod="swift-kuttl-tests/keystone-db-sync-4d4vj" Mar 07 07:17:28 crc kubenswrapper[4738]: I0307 07:17:28.605436 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-4d4vj" Mar 07 07:17:29 crc kubenswrapper[4738]: I0307 07:17:29.089573 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-db-sync-4d4vj"] Mar 07 07:17:29 crc kubenswrapper[4738]: W0307 07:17:29.094766 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f13db65_a5b3_4f0d_bdc1_2d728eaa60ee.slice/crio-a549fdb4fd132aa26bc7123699941a5e03e2b791ae397c500e5964d8c631b428 WatchSource:0}: Error finding container a549fdb4fd132aa26bc7123699941a5e03e2b791ae397c500e5964d8c631b428: Status 404 returned error can't find the container with id a549fdb4fd132aa26bc7123699941a5e03e2b791ae397c500e5964d8c631b428 Mar 07 07:17:29 crc kubenswrapper[4738]: I0307 07:17:29.391128 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-index-76wqn" event={"ID":"3b87b2f5-b8bf-451f-907e-c47b1c197381","Type":"ContainerStarted","Data":"10aa6a67b9375795da8fab7fc70265e1d7f4826b5f375a00d977c3a08897cb8c"} Mar 07 07:17:29 crc kubenswrapper[4738]: I0307 07:17:29.393042 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-sync-4d4vj" event={"ID":"6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee","Type":"ContainerStarted","Data":"a549fdb4fd132aa26bc7123699941a5e03e2b791ae397c500e5964d8c631b428"} Mar 07 07:17:29 crc kubenswrapper[4738]: I0307 07:17:29.411046 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-index-76wqn" podStartSLOduration=1.860099189 podStartE2EDuration="3.411025219s" podCreationTimestamp="2026-03-07 07:17:26 +0000 UTC" firstStartedPulling="2026-03-07 07:17:27.12032615 +0000 UTC m=+1065.585313471" lastFinishedPulling="2026-03-07 07:17:28.67125217 +0000 UTC m=+1067.136239501" observedRunningTime="2026-03-07 07:17:29.404129401 +0000 UTC m=+1067.869116722" 
watchObservedRunningTime="2026-03-07 07:17:29.411025219 +0000 UTC m=+1067.876012540" Mar 07 07:17:36 crc kubenswrapper[4738]: I0307 07:17:36.479348 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-sync-4d4vj" event={"ID":"6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee","Type":"ContainerStarted","Data":"7fa14d51a8680009a813b1fcffd1385164c6dc55cef6e173c734467b48a41482"} Mar 07 07:17:36 crc kubenswrapper[4738]: I0307 07:17:36.498486 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-db-sync-4d4vj" podStartSLOduration=2.001024805 podStartE2EDuration="8.49846442s" podCreationTimestamp="2026-03-07 07:17:28 +0000 UTC" firstStartedPulling="2026-03-07 07:17:29.097149805 +0000 UTC m=+1067.562137126" lastFinishedPulling="2026-03-07 07:17:35.59458943 +0000 UTC m=+1074.059576741" observedRunningTime="2026-03-07 07:17:36.498296345 +0000 UTC m=+1074.963283676" watchObservedRunningTime="2026-03-07 07:17:36.49846442 +0000 UTC m=+1074.963451741" Mar 07 07:17:36 crc kubenswrapper[4738]: I0307 07:17:36.624774 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/barbican-operator-index-76wqn" Mar 07 07:17:36 crc kubenswrapper[4738]: I0307 07:17:36.624900 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-index-76wqn" Mar 07 07:17:36 crc kubenswrapper[4738]: I0307 07:17:36.659516 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/barbican-operator-index-76wqn" Mar 07 07:17:37 crc kubenswrapper[4738]: I0307 07:17:37.523869 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-index-76wqn" Mar 07 07:17:39 crc kubenswrapper[4738]: I0307 07:17:39.509324 4738 generic.go:334] "Generic (PLEG): container finished" podID="6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee" 
containerID="7fa14d51a8680009a813b1fcffd1385164c6dc55cef6e173c734467b48a41482" exitCode=0 Mar 07 07:17:39 crc kubenswrapper[4738]: I0307 07:17:39.509431 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-sync-4d4vj" event={"ID":"6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee","Type":"ContainerDied","Data":"7fa14d51a8680009a813b1fcffd1385164c6dc55cef6e173c734467b48a41482"} Mar 07 07:17:39 crc kubenswrapper[4738]: I0307 07:17:39.743050 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6"] Mar 07 07:17:39 crc kubenswrapper[4738]: I0307 07:17:39.744692 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" Mar 07 07:17:39 crc kubenswrapper[4738]: I0307 07:17:39.747016 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-fvfwj" Mar 07 07:17:39 crc kubenswrapper[4738]: I0307 07:17:39.766469 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6"] Mar 07 07:17:39 crc kubenswrapper[4738]: I0307 07:17:39.882123 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ab2f06d-a208-4b05-a799-10ff718be96f-util\") pod \"c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6\" (UID: \"9ab2f06d-a208-4b05-a799-10ff718be96f\") " pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" Mar 07 07:17:39 crc kubenswrapper[4738]: I0307 07:17:39.882261 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hbw9\" (UniqueName: \"kubernetes.io/projected/9ab2f06d-a208-4b05-a799-10ff718be96f-kube-api-access-7hbw9\") pod 
\"c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6\" (UID: \"9ab2f06d-a208-4b05-a799-10ff718be96f\") " pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" Mar 07 07:17:39 crc kubenswrapper[4738]: I0307 07:17:39.882309 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ab2f06d-a208-4b05-a799-10ff718be96f-bundle\") pod \"c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6\" (UID: \"9ab2f06d-a208-4b05-a799-10ff718be96f\") " pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" Mar 07 07:17:39 crc kubenswrapper[4738]: I0307 07:17:39.983709 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ab2f06d-a208-4b05-a799-10ff718be96f-util\") pod \"c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6\" (UID: \"9ab2f06d-a208-4b05-a799-10ff718be96f\") " pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" Mar 07 07:17:39 crc kubenswrapper[4738]: I0307 07:17:39.983780 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hbw9\" (UniqueName: \"kubernetes.io/projected/9ab2f06d-a208-4b05-a799-10ff718be96f-kube-api-access-7hbw9\") pod \"c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6\" (UID: \"9ab2f06d-a208-4b05-a799-10ff718be96f\") " pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" Mar 07 07:17:39 crc kubenswrapper[4738]: I0307 07:17:39.983818 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ab2f06d-a208-4b05-a799-10ff718be96f-bundle\") pod \"c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6\" (UID: \"9ab2f06d-a208-4b05-a799-10ff718be96f\") " 
pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" Mar 07 07:17:39 crc kubenswrapper[4738]: I0307 07:17:39.985116 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ab2f06d-a208-4b05-a799-10ff718be96f-bundle\") pod \"c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6\" (UID: \"9ab2f06d-a208-4b05-a799-10ff718be96f\") " pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" Mar 07 07:17:39 crc kubenswrapper[4738]: I0307 07:17:39.985401 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ab2f06d-a208-4b05-a799-10ff718be96f-util\") pod \"c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6\" (UID: \"9ab2f06d-a208-4b05-a799-10ff718be96f\") " pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" Mar 07 07:17:40 crc kubenswrapper[4738]: I0307 07:17:40.004402 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hbw9\" (UniqueName: \"kubernetes.io/projected/9ab2f06d-a208-4b05-a799-10ff718be96f-kube-api-access-7hbw9\") pod \"c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6\" (UID: \"9ab2f06d-a208-4b05-a799-10ff718be96f\") " pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" Mar 07 07:17:40 crc kubenswrapper[4738]: I0307 07:17:40.068600 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" Mar 07 07:17:40 crc kubenswrapper[4738]: I0307 07:17:40.549496 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6"] Mar 07 07:17:40 crc kubenswrapper[4738]: W0307 07:17:40.557049 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ab2f06d_a208_4b05_a799_10ff718be96f.slice/crio-73d8963c72894e6fac74e7e3ee94071f2e8f778cc5a02ca8fef38f968592986e WatchSource:0}: Error finding container 73d8963c72894e6fac74e7e3ee94071f2e8f778cc5a02ca8fef38f968592986e: Status 404 returned error can't find the container with id 73d8963c72894e6fac74e7e3ee94071f2e8f778cc5a02ca8fef38f968592986e Mar 07 07:17:40 crc kubenswrapper[4738]: I0307 07:17:40.738409 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-4d4vj" Mar 07 07:17:40 crc kubenswrapper[4738]: I0307 07:17:40.902781 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee-config-data\") pod \"6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee\" (UID: \"6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee\") " Mar 07 07:17:40 crc kubenswrapper[4738]: I0307 07:17:40.902916 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzvdp\" (UniqueName: \"kubernetes.io/projected/6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee-kube-api-access-pzvdp\") pod \"6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee\" (UID: \"6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee\") " Mar 07 07:17:40 crc kubenswrapper[4738]: I0307 07:17:40.909591 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee-kube-api-access-pzvdp" 
(OuterVolumeSpecName: "kube-api-access-pzvdp") pod "6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee" (UID: "6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee"). InnerVolumeSpecName "kube-api-access-pzvdp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:40 crc kubenswrapper[4738]: I0307 07:17:40.935349 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee-config-data" (OuterVolumeSpecName: "config-data") pod "6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee" (UID: "6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.004376 4738 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.004417 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pzvdp\" (UniqueName: \"kubernetes.io/projected/6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee-kube-api-access-pzvdp\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.524359 4738 generic.go:334] "Generic (PLEG): container finished" podID="9ab2f06d-a208-4b05-a799-10ff718be96f" containerID="75eee2baf34857889c3a535017c4d8e5049001a2951b65e6c88ab615e37b0926" exitCode=0 Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.524443 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" event={"ID":"9ab2f06d-a208-4b05-a799-10ff718be96f","Type":"ContainerDied","Data":"75eee2baf34857889c3a535017c4d8e5049001a2951b65e6c88ab615e37b0926"} Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.524881 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" event={"ID":"9ab2f06d-a208-4b05-a799-10ff718be96f","Type":"ContainerStarted","Data":"73d8963c72894e6fac74e7e3ee94071f2e8f778cc5a02ca8fef38f968592986e"} Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.527461 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-db-sync-4d4vj" event={"ID":"6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee","Type":"ContainerDied","Data":"a549fdb4fd132aa26bc7123699941a5e03e2b791ae397c500e5964d8c631b428"} Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.527512 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a549fdb4fd132aa26bc7123699941a5e03e2b791ae397c500e5964d8c631b428" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.527519 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-db-sync-4d4vj" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.763045 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-b44m6"] Mar 07 07:17:41 crc kubenswrapper[4738]: E0307 07:17:41.763473 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee" containerName="keystone-db-sync" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.763494 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee" containerName="keystone-db-sync" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.763692 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee" containerName="keystone-db-sync" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.764403 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.767976 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-keystone-dockercfg-j2zpt" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.768125 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.768304 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-config-data" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.768408 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-scripts" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.768556 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"osp-secret" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.776700 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-b44m6"] Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.817356 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-scripts\") pod \"keystone-bootstrap-b44m6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.817420 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xmhw\" (UniqueName: \"kubernetes.io/projected/44b010f6-1ee6-4660-a1dc-83507adb2ba6-kube-api-access-7xmhw\") pod \"keystone-bootstrap-b44m6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.817444 4738 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-config-data\") pod \"keystone-bootstrap-b44m6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.817469 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-credential-keys\") pod \"keystone-bootstrap-b44m6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.817489 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-fernet-keys\") pod \"keystone-bootstrap-b44m6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.918811 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-credential-keys\") pod \"keystone-bootstrap-b44m6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.918860 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-fernet-keys\") pod \"keystone-bootstrap-b44m6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.918944 4738 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-scripts\") pod \"keystone-bootstrap-b44m6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.918988 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xmhw\" (UniqueName: \"kubernetes.io/projected/44b010f6-1ee6-4660-a1dc-83507adb2ba6-kube-api-access-7xmhw\") pod \"keystone-bootstrap-b44m6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.919008 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-config-data\") pod \"keystone-bootstrap-b44m6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.925012 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-scripts\") pod \"keystone-bootstrap-b44m6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.925318 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-fernet-keys\") pod \"keystone-bootstrap-b44m6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.927302 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-credential-keys\") pod \"keystone-bootstrap-b44m6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.929021 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-config-data\") pod \"keystone-bootstrap-b44m6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:41 crc kubenswrapper[4738]: I0307 07:17:41.943202 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xmhw\" (UniqueName: \"kubernetes.io/projected/44b010f6-1ee6-4660-a1dc-83507adb2ba6-kube-api-access-7xmhw\") pod \"keystone-bootstrap-b44m6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:42 crc kubenswrapper[4738]: I0307 07:17:42.080660 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:42 crc kubenswrapper[4738]: I0307 07:17:42.314250 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-b44m6"] Mar 07 07:17:42 crc kubenswrapper[4738]: I0307 07:17:42.536724 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-b44m6" event={"ID":"44b010f6-1ee6-4660-a1dc-83507adb2ba6","Type":"ContainerStarted","Data":"903ef9e3e019fceb511f26aac2b17b266248756bd97a84e411e203273d559ca2"} Mar 07 07:17:42 crc kubenswrapper[4738]: I0307 07:17:42.537050 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-b44m6" event={"ID":"44b010f6-1ee6-4660-a1dc-83507adb2ba6","Type":"ContainerStarted","Data":"a61364241f01ca27ae3e6018e380d951ae748fb916992d8100d81310f567cced"} Mar 07 07:17:42 crc kubenswrapper[4738]: I0307 07:17:42.563791 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-bootstrap-b44m6" podStartSLOduration=1.563768791 podStartE2EDuration="1.563768791s" podCreationTimestamp="2026-03-07 07:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:17:42.557944383 +0000 UTC m=+1081.022931714" watchObservedRunningTime="2026-03-07 07:17:42.563768791 +0000 UTC m=+1081.028756112" Mar 07 07:17:43 crc kubenswrapper[4738]: I0307 07:17:43.548192 4738 generic.go:334] "Generic (PLEG): container finished" podID="9ab2f06d-a208-4b05-a799-10ff718be96f" containerID="49cf7f4b33ab1782f4bc65e6c9b13d9ac661e039f2827841b8ec4cf35e98ea90" exitCode=0 Mar 07 07:17:43 crc kubenswrapper[4738]: I0307 07:17:43.548309 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" 
event={"ID":"9ab2f06d-a208-4b05-a799-10ff718be96f","Type":"ContainerDied","Data":"49cf7f4b33ab1782f4bc65e6c9b13d9ac661e039f2827841b8ec4cf35e98ea90"} Mar 07 07:17:44 crc kubenswrapper[4738]: I0307 07:17:44.558140 4738 generic.go:334] "Generic (PLEG): container finished" podID="9ab2f06d-a208-4b05-a799-10ff718be96f" containerID="f2625a9ccbf78a67596902a4f64ba7d04d5d9c305b16bebc03e021798e8bb306" exitCode=0 Mar 07 07:17:44 crc kubenswrapper[4738]: I0307 07:17:44.558233 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" event={"ID":"9ab2f06d-a208-4b05-a799-10ff718be96f","Type":"ContainerDied","Data":"f2625a9ccbf78a67596902a4f64ba7d04d5d9c305b16bebc03e021798e8bb306"} Mar 07 07:17:45 crc kubenswrapper[4738]: I0307 07:17:45.568545 4738 generic.go:334] "Generic (PLEG): container finished" podID="44b010f6-1ee6-4660-a1dc-83507adb2ba6" containerID="903ef9e3e019fceb511f26aac2b17b266248756bd97a84e411e203273d559ca2" exitCode=0 Mar 07 07:17:45 crc kubenswrapper[4738]: I0307 07:17:45.568608 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-b44m6" event={"ID":"44b010f6-1ee6-4660-a1dc-83507adb2ba6","Type":"ContainerDied","Data":"903ef9e3e019fceb511f26aac2b17b266248756bd97a84e411e203273d559ca2"} Mar 07 07:17:45 crc kubenswrapper[4738]: I0307 07:17:45.878012 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" Mar 07 07:17:45 crc kubenswrapper[4738]: I0307 07:17:45.980818 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hbw9\" (UniqueName: \"kubernetes.io/projected/9ab2f06d-a208-4b05-a799-10ff718be96f-kube-api-access-7hbw9\") pod \"9ab2f06d-a208-4b05-a799-10ff718be96f\" (UID: \"9ab2f06d-a208-4b05-a799-10ff718be96f\") " Mar 07 07:17:45 crc kubenswrapper[4738]: I0307 07:17:45.981061 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ab2f06d-a208-4b05-a799-10ff718be96f-util\") pod \"9ab2f06d-a208-4b05-a799-10ff718be96f\" (UID: \"9ab2f06d-a208-4b05-a799-10ff718be96f\") " Mar 07 07:17:45 crc kubenswrapper[4738]: I0307 07:17:45.981125 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ab2f06d-a208-4b05-a799-10ff718be96f-bundle\") pod \"9ab2f06d-a208-4b05-a799-10ff718be96f\" (UID: \"9ab2f06d-a208-4b05-a799-10ff718be96f\") " Mar 07 07:17:45 crc kubenswrapper[4738]: I0307 07:17:45.983486 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ab2f06d-a208-4b05-a799-10ff718be96f-bundle" (OuterVolumeSpecName: "bundle") pod "9ab2f06d-a208-4b05-a799-10ff718be96f" (UID: "9ab2f06d-a208-4b05-a799-10ff718be96f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:45 crc kubenswrapper[4738]: I0307 07:17:45.988590 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ab2f06d-a208-4b05-a799-10ff718be96f-kube-api-access-7hbw9" (OuterVolumeSpecName: "kube-api-access-7hbw9") pod "9ab2f06d-a208-4b05-a799-10ff718be96f" (UID: "9ab2f06d-a208-4b05-a799-10ff718be96f"). InnerVolumeSpecName "kube-api-access-7hbw9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:46 crc kubenswrapper[4738]: I0307 07:17:46.003643 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ab2f06d-a208-4b05-a799-10ff718be96f-util" (OuterVolumeSpecName: "util") pod "9ab2f06d-a208-4b05-a799-10ff718be96f" (UID: "9ab2f06d-a208-4b05-a799-10ff718be96f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:17:46 crc kubenswrapper[4738]: I0307 07:17:46.083441 4738 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ab2f06d-a208-4b05-a799-10ff718be96f-util\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:46 crc kubenswrapper[4738]: I0307 07:17:46.083511 4738 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ab2f06d-a208-4b05-a799-10ff718be96f-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:46 crc kubenswrapper[4738]: I0307 07:17:46.083522 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hbw9\" (UniqueName: \"kubernetes.io/projected/9ab2f06d-a208-4b05-a799-10ff718be96f-kube-api-access-7hbw9\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:46 crc kubenswrapper[4738]: I0307 07:17:46.578303 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" event={"ID":"9ab2f06d-a208-4b05-a799-10ff718be96f","Type":"ContainerDied","Data":"73d8963c72894e6fac74e7e3ee94071f2e8f778cc5a02ca8fef38f968592986e"} Mar 07 07:17:46 crc kubenswrapper[4738]: I0307 07:17:46.578340 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6" Mar 07 07:17:46 crc kubenswrapper[4738]: I0307 07:17:46.578364 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73d8963c72894e6fac74e7e3ee94071f2e8f778cc5a02ca8fef38f968592986e" Mar 07 07:17:46 crc kubenswrapper[4738]: I0307 07:17:46.883581 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:46 crc kubenswrapper[4738]: I0307 07:17:46.996595 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-fernet-keys\") pod \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " Mar 07 07:17:46 crc kubenswrapper[4738]: I0307 07:17:46.996767 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xmhw\" (UniqueName: \"kubernetes.io/projected/44b010f6-1ee6-4660-a1dc-83507adb2ba6-kube-api-access-7xmhw\") pod \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " Mar 07 07:17:46 crc kubenswrapper[4738]: I0307 07:17:46.996839 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-scripts\") pod \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " Mar 07 07:17:46 crc kubenswrapper[4738]: I0307 07:17:46.996888 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-config-data\") pod \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " Mar 07 07:17:46 crc kubenswrapper[4738]: I0307 07:17:46.996913 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-credential-keys\") pod \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\" (UID: \"44b010f6-1ee6-4660-a1dc-83507adb2ba6\") " Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.002664 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "44b010f6-1ee6-4660-a1dc-83507adb2ba6" (UID: "44b010f6-1ee6-4660-a1dc-83507adb2ba6"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.005066 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "44b010f6-1ee6-4660-a1dc-83507adb2ba6" (UID: "44b010f6-1ee6-4660-a1dc-83507adb2ba6"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.005403 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44b010f6-1ee6-4660-a1dc-83507adb2ba6-kube-api-access-7xmhw" (OuterVolumeSpecName: "kube-api-access-7xmhw") pod "44b010f6-1ee6-4660-a1dc-83507adb2ba6" (UID: "44b010f6-1ee6-4660-a1dc-83507adb2ba6"). InnerVolumeSpecName "kube-api-access-7xmhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.005574 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-scripts" (OuterVolumeSpecName: "scripts") pod "44b010f6-1ee6-4660-a1dc-83507adb2ba6" (UID: "44b010f6-1ee6-4660-a1dc-83507adb2ba6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.029475 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-config-data" (OuterVolumeSpecName: "config-data") pod "44b010f6-1ee6-4660-a1dc-83507adb2ba6" (UID: "44b010f6-1ee6-4660-a1dc-83507adb2ba6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.098764 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.099072 4738 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.099185 4738 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.099279 4738 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/44b010f6-1ee6-4660-a1dc-83507adb2ba6-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.099349 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xmhw\" (UniqueName: \"kubernetes.io/projected/44b010f6-1ee6-4660-a1dc-83507adb2ba6-kube-api-access-7xmhw\") on node \"crc\" DevicePath \"\"" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.585761 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-bootstrap-b44m6" 
event={"ID":"44b010f6-1ee6-4660-a1dc-83507adb2ba6","Type":"ContainerDied","Data":"a61364241f01ca27ae3e6018e380d951ae748fb916992d8100d81310f567cced"} Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.586258 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a61364241f01ca27ae3e6018e380d951ae748fb916992d8100d81310f567cced" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.585809 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-bootstrap-b44m6" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.675557 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/keystone-686f9bb899-4llf4"] Mar 07 07:17:47 crc kubenswrapper[4738]: E0307 07:17:47.676033 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab2f06d-a208-4b05-a799-10ff718be96f" containerName="pull" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.676127 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab2f06d-a208-4b05-a799-10ff718be96f" containerName="pull" Mar 07 07:17:47 crc kubenswrapper[4738]: E0307 07:17:47.676202 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ab2f06d-a208-4b05-a799-10ff718be96f" containerName="extract" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.676268 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab2f06d-a208-4b05-a799-10ff718be96f" containerName="extract" Mar 07 07:17:47 crc kubenswrapper[4738]: E0307 07:17:47.676332 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="44b010f6-1ee6-4660-a1dc-83507adb2ba6" containerName="keystone-bootstrap" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.676381 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="44b010f6-1ee6-4660-a1dc-83507adb2ba6" containerName="keystone-bootstrap" Mar 07 07:17:47 crc kubenswrapper[4738]: E0307 07:17:47.676436 4738 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="9ab2f06d-a208-4b05-a799-10ff718be96f" containerName="util" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.676483 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ab2f06d-a208-4b05-a799-10ff718be96f" containerName="util" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.676665 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="44b010f6-1ee6-4660-a1dc-83507adb2ba6" containerName="keystone-bootstrap" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.676735 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ab2f06d-a208-4b05-a799-10ff718be96f" containerName="extract" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.677452 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.680465 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-keystone-dockercfg-j2zpt" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.681051 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-config-data" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.681116 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.681784 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"keystone-scripts" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.691889 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-686f9bb899-4llf4"] Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.708581 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fc363915-16ef-41ad-9d4a-542abfe49889-credential-keys\") pod 
\"keystone-686f9bb899-4llf4\" (UID: \"fc363915-16ef-41ad-9d4a-542abfe49889\") " pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.708657 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc363915-16ef-41ad-9d4a-542abfe49889-config-data\") pod \"keystone-686f9bb899-4llf4\" (UID: \"fc363915-16ef-41ad-9d4a-542abfe49889\") " pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.708723 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz9q9\" (UniqueName: \"kubernetes.io/projected/fc363915-16ef-41ad-9d4a-542abfe49889-kube-api-access-rz9q9\") pod \"keystone-686f9bb899-4llf4\" (UID: \"fc363915-16ef-41ad-9d4a-542abfe49889\") " pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.708772 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fc363915-16ef-41ad-9d4a-542abfe49889-fernet-keys\") pod \"keystone-686f9bb899-4llf4\" (UID: \"fc363915-16ef-41ad-9d4a-542abfe49889\") " pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.708818 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc363915-16ef-41ad-9d4a-542abfe49889-scripts\") pod \"keystone-686f9bb899-4llf4\" (UID: \"fc363915-16ef-41ad-9d4a-542abfe49889\") " pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.809751 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/fc363915-16ef-41ad-9d4a-542abfe49889-credential-keys\") pod \"keystone-686f9bb899-4llf4\" (UID: \"fc363915-16ef-41ad-9d4a-542abfe49889\") " pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.810034 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc363915-16ef-41ad-9d4a-542abfe49889-config-data\") pod \"keystone-686f9bb899-4llf4\" (UID: \"fc363915-16ef-41ad-9d4a-542abfe49889\") " pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.810141 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz9q9\" (UniqueName: \"kubernetes.io/projected/fc363915-16ef-41ad-9d4a-542abfe49889-kube-api-access-rz9q9\") pod \"keystone-686f9bb899-4llf4\" (UID: \"fc363915-16ef-41ad-9d4a-542abfe49889\") " pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.810258 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc363915-16ef-41ad-9d4a-542abfe49889-scripts\") pod \"keystone-686f9bb899-4llf4\" (UID: \"fc363915-16ef-41ad-9d4a-542abfe49889\") " pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.810333 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fc363915-16ef-41ad-9d4a-542abfe49889-fernet-keys\") pod \"keystone-686f9bb899-4llf4\" (UID: \"fc363915-16ef-41ad-9d4a-542abfe49889\") " pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.813232 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/fc363915-16ef-41ad-9d4a-542abfe49889-credential-keys\") pod \"keystone-686f9bb899-4llf4\" (UID: \"fc363915-16ef-41ad-9d4a-542abfe49889\") " pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.814938 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc363915-16ef-41ad-9d4a-542abfe49889-config-data\") pod \"keystone-686f9bb899-4llf4\" (UID: \"fc363915-16ef-41ad-9d4a-542abfe49889\") " pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.815665 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fc363915-16ef-41ad-9d4a-542abfe49889-fernet-keys\") pod \"keystone-686f9bb899-4llf4\" (UID: \"fc363915-16ef-41ad-9d4a-542abfe49889\") " pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.815914 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc363915-16ef-41ad-9d4a-542abfe49889-scripts\") pod \"keystone-686f9bb899-4llf4\" (UID: \"fc363915-16ef-41ad-9d4a-542abfe49889\") " pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.831051 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz9q9\" (UniqueName: \"kubernetes.io/projected/fc363915-16ef-41ad-9d4a-542abfe49889-kube-api-access-rz9q9\") pod \"keystone-686f9bb899-4llf4\" (UID: \"fc363915-16ef-41ad-9d4a-542abfe49889\") " pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:47 crc kubenswrapper[4738]: I0307 07:17:47.992062 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:48 crc kubenswrapper[4738]: I0307 07:17:48.199727 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/keystone-686f9bb899-4llf4"] Mar 07 07:17:48 crc kubenswrapper[4738]: I0307 07:17:48.594582 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" event={"ID":"fc363915-16ef-41ad-9d4a-542abfe49889","Type":"ContainerStarted","Data":"b8e09aac64f713151506238d2e07dc215a976d9c09fe0b297465e65a7af652eb"} Mar 07 07:17:48 crc kubenswrapper[4738]: I0307 07:17:48.594633 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" event={"ID":"fc363915-16ef-41ad-9d4a-542abfe49889","Type":"ContainerStarted","Data":"9c74f5fe0d2f352efa3ae70d7235e4326d2880331405ce2b498cb5cab6301441"} Mar 07 07:17:48 crc kubenswrapper[4738]: I0307 07:17:48.594796 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:17:48 crc kubenswrapper[4738]: I0307 07:17:48.658412 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" podStartSLOduration=1.658379804 podStartE2EDuration="1.658379804s" podCreationTimestamp="2026-03-07 07:17:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:17:48.641661712 +0000 UTC m=+1087.106649043" watchObservedRunningTime="2026-03-07 07:17:48.658379804 +0000 UTC m=+1087.123367155" Mar 07 07:17:55 crc kubenswrapper[4738]: I0307 07:17:55.235826 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm"] Mar 07 07:17:55 crc kubenswrapper[4738]: I0307 07:17:55.237328 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" Mar 07 07:17:55 crc kubenswrapper[4738]: I0307 07:17:55.240042 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-4r6k4" Mar 07 07:17:55 crc kubenswrapper[4738]: I0307 07:17:55.240452 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-service-cert" Mar 07 07:17:55 crc kubenswrapper[4738]: I0307 07:17:55.247030 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm"] Mar 07 07:17:55 crc kubenswrapper[4738]: I0307 07:17:55.345822 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b5c35317-4298-4385-b73f-e73d48fb756e-apiservice-cert\") pod \"barbican-operator-controller-manager-648b4464d8-k2jvm\" (UID: \"b5c35317-4298-4385-b73f-e73d48fb756e\") " pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" Mar 07 07:17:55 crc kubenswrapper[4738]: I0307 07:17:55.345871 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gm8n\" (UniqueName: \"kubernetes.io/projected/b5c35317-4298-4385-b73f-e73d48fb756e-kube-api-access-9gm8n\") pod \"barbican-operator-controller-manager-648b4464d8-k2jvm\" (UID: \"b5c35317-4298-4385-b73f-e73d48fb756e\") " pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" Mar 07 07:17:55 crc kubenswrapper[4738]: I0307 07:17:55.345948 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b5c35317-4298-4385-b73f-e73d48fb756e-webhook-cert\") pod \"barbican-operator-controller-manager-648b4464d8-k2jvm\" 
(UID: \"b5c35317-4298-4385-b73f-e73d48fb756e\") " pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" Mar 07 07:17:55 crc kubenswrapper[4738]: I0307 07:17:55.447030 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b5c35317-4298-4385-b73f-e73d48fb756e-apiservice-cert\") pod \"barbican-operator-controller-manager-648b4464d8-k2jvm\" (UID: \"b5c35317-4298-4385-b73f-e73d48fb756e\") " pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" Mar 07 07:17:55 crc kubenswrapper[4738]: I0307 07:17:55.447081 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gm8n\" (UniqueName: \"kubernetes.io/projected/b5c35317-4298-4385-b73f-e73d48fb756e-kube-api-access-9gm8n\") pod \"barbican-operator-controller-manager-648b4464d8-k2jvm\" (UID: \"b5c35317-4298-4385-b73f-e73d48fb756e\") " pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" Mar 07 07:17:55 crc kubenswrapper[4738]: I0307 07:17:55.447591 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b5c35317-4298-4385-b73f-e73d48fb756e-webhook-cert\") pod \"barbican-operator-controller-manager-648b4464d8-k2jvm\" (UID: \"b5c35317-4298-4385-b73f-e73d48fb756e\") " pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" Mar 07 07:17:55 crc kubenswrapper[4738]: I0307 07:17:55.453712 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b5c35317-4298-4385-b73f-e73d48fb756e-apiservice-cert\") pod \"barbican-operator-controller-manager-648b4464d8-k2jvm\" (UID: \"b5c35317-4298-4385-b73f-e73d48fb756e\") " pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" Mar 07 07:17:55 crc kubenswrapper[4738]: I0307 07:17:55.455825 4738 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b5c35317-4298-4385-b73f-e73d48fb756e-webhook-cert\") pod \"barbican-operator-controller-manager-648b4464d8-k2jvm\" (UID: \"b5c35317-4298-4385-b73f-e73d48fb756e\") " pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" Mar 07 07:17:55 crc kubenswrapper[4738]: I0307 07:17:55.475063 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gm8n\" (UniqueName: \"kubernetes.io/projected/b5c35317-4298-4385-b73f-e73d48fb756e-kube-api-access-9gm8n\") pod \"barbican-operator-controller-manager-648b4464d8-k2jvm\" (UID: \"b5c35317-4298-4385-b73f-e73d48fb756e\") " pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" Mar 07 07:17:55 crc kubenswrapper[4738]: I0307 07:17:55.570855 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" Mar 07 07:17:56 crc kubenswrapper[4738]: I0307 07:17:56.064106 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm"] Mar 07 07:17:56 crc kubenswrapper[4738]: I0307 07:17:56.671568 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" event={"ID":"b5c35317-4298-4385-b73f-e73d48fb756e","Type":"ContainerStarted","Data":"0913d401016d02aa2c5ad2ecd328314a50895914db46028128219037e4fdb370"} Mar 07 07:17:56 crc kubenswrapper[4738]: I0307 07:17:56.958246 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:17:56 crc kubenswrapper[4738]: I0307 
07:17:56.958331 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:18:00 crc kubenswrapper[4738]: I0307 07:18:00.130210 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547798-hf5hx"] Mar 07 07:18:00 crc kubenswrapper[4738]: I0307 07:18:00.132277 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547798-hf5hx" Mar 07 07:18:00 crc kubenswrapper[4738]: I0307 07:18:00.138627 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:18:00 crc kubenswrapper[4738]: I0307 07:18:00.139009 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:18:00 crc kubenswrapper[4738]: I0307 07:18:00.140661 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:18:00 crc kubenswrapper[4738]: I0307 07:18:00.145982 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547798-hf5hx"] Mar 07 07:18:00 crc kubenswrapper[4738]: I0307 07:18:00.239026 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-269kw\" (UniqueName: \"kubernetes.io/projected/64e3d0e4-8776-4fb3-9a4b-4205a9e755c1-kube-api-access-269kw\") pod \"auto-csr-approver-29547798-hf5hx\" (UID: \"64e3d0e4-8776-4fb3-9a4b-4205a9e755c1\") " pod="openshift-infra/auto-csr-approver-29547798-hf5hx" Mar 07 07:18:00 crc kubenswrapper[4738]: I0307 07:18:00.340976 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-269kw\" (UniqueName: \"kubernetes.io/projected/64e3d0e4-8776-4fb3-9a4b-4205a9e755c1-kube-api-access-269kw\") pod \"auto-csr-approver-29547798-hf5hx\" (UID: \"64e3d0e4-8776-4fb3-9a4b-4205a9e755c1\") " pod="openshift-infra/auto-csr-approver-29547798-hf5hx" Mar 07 07:18:00 crc kubenswrapper[4738]: I0307 07:18:00.367962 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-269kw\" (UniqueName: \"kubernetes.io/projected/64e3d0e4-8776-4fb3-9a4b-4205a9e755c1-kube-api-access-269kw\") pod \"auto-csr-approver-29547798-hf5hx\" (UID: \"64e3d0e4-8776-4fb3-9a4b-4205a9e755c1\") " pod="openshift-infra/auto-csr-approver-29547798-hf5hx" Mar 07 07:18:00 crc kubenswrapper[4738]: I0307 07:18:00.449404 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547798-hf5hx" Mar 07 07:18:00 crc kubenswrapper[4738]: I0307 07:18:00.703471 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" event={"ID":"b5c35317-4298-4385-b73f-e73d48fb756e","Type":"ContainerStarted","Data":"e41cbda9c582288f07fc2354364834e682af4309c94b2ebe7a30090cd7ed3155"} Mar 07 07:18:00 crc kubenswrapper[4738]: I0307 07:18:00.703807 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" Mar 07 07:18:00 crc kubenswrapper[4738]: I0307 07:18:00.728426 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" podStartSLOduration=1.960732971 podStartE2EDuration="5.728410698s" podCreationTimestamp="2026-03-07 07:17:55 +0000 UTC" firstStartedPulling="2026-03-07 07:17:56.077273465 +0000 UTC m=+1094.542260786" lastFinishedPulling="2026-03-07 07:17:59.844951192 +0000 UTC m=+1098.309938513" observedRunningTime="2026-03-07 07:18:00.725353396 +0000 UTC 
m=+1099.190340727" watchObservedRunningTime="2026-03-07 07:18:00.728410698 +0000 UTC m=+1099.193398019" Mar 07 07:18:00 crc kubenswrapper[4738]: I0307 07:18:00.918397 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547798-hf5hx"] Mar 07 07:18:01 crc kubenswrapper[4738]: I0307 07:18:01.716652 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547798-hf5hx" event={"ID":"64e3d0e4-8776-4fb3-9a4b-4205a9e755c1","Type":"ContainerStarted","Data":"e3161654ac3fff49d1c33158bd885b715ad68dc2739e92a76b4b41a4382f5e09"} Mar 07 07:18:02 crc kubenswrapper[4738]: I0307 07:18:02.725204 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547798-hf5hx" event={"ID":"64e3d0e4-8776-4fb3-9a4b-4205a9e755c1","Type":"ContainerDied","Data":"a02e9bf78d4168178809e4f70eefd0ac959530c049ac10ae0083653262e5efa8"} Mar 07 07:18:02 crc kubenswrapper[4738]: I0307 07:18:02.725142 4738 generic.go:334] "Generic (PLEG): container finished" podID="64e3d0e4-8776-4fb3-9a4b-4205a9e755c1" containerID="a02e9bf78d4168178809e4f70eefd0ac959530c049ac10ae0083653262e5efa8" exitCode=0 Mar 07 07:18:04 crc kubenswrapper[4738]: I0307 07:18:04.033534 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547798-hf5hx" Mar 07 07:18:04 crc kubenswrapper[4738]: I0307 07:18:04.093797 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-269kw\" (UniqueName: \"kubernetes.io/projected/64e3d0e4-8776-4fb3-9a4b-4205a9e755c1-kube-api-access-269kw\") pod \"64e3d0e4-8776-4fb3-9a4b-4205a9e755c1\" (UID: \"64e3d0e4-8776-4fb3-9a4b-4205a9e755c1\") " Mar 07 07:18:04 crc kubenswrapper[4738]: I0307 07:18:04.099477 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/64e3d0e4-8776-4fb3-9a4b-4205a9e755c1-kube-api-access-269kw" (OuterVolumeSpecName: "kube-api-access-269kw") pod "64e3d0e4-8776-4fb3-9a4b-4205a9e755c1" (UID: "64e3d0e4-8776-4fb3-9a4b-4205a9e755c1"). InnerVolumeSpecName "kube-api-access-269kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:18:04 crc kubenswrapper[4738]: I0307 07:18:04.195881 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-269kw\" (UniqueName: \"kubernetes.io/projected/64e3d0e4-8776-4fb3-9a4b-4205a9e755c1-kube-api-access-269kw\") on node \"crc\" DevicePath \"\"" Mar 07 07:18:04 crc kubenswrapper[4738]: I0307 07:18:04.751127 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547798-hf5hx" event={"ID":"64e3d0e4-8776-4fb3-9a4b-4205a9e755c1","Type":"ContainerDied","Data":"e3161654ac3fff49d1c33158bd885b715ad68dc2739e92a76b4b41a4382f5e09"} Mar 07 07:18:04 crc kubenswrapper[4738]: I0307 07:18:04.751225 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3161654ac3fff49d1c33158bd885b715ad68dc2739e92a76b4b41a4382f5e09" Mar 07 07:18:04 crc kubenswrapper[4738]: I0307 07:18:04.751281 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547798-hf5hx" Mar 07 07:18:05 crc kubenswrapper[4738]: I0307 07:18:05.117889 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547792-2dlw8"] Mar 07 07:18:05 crc kubenswrapper[4738]: I0307 07:18:05.169348 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547792-2dlw8"] Mar 07 07:18:05 crc kubenswrapper[4738]: I0307 07:18:05.577244 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-648b4464d8-k2jvm" Mar 07 07:18:06 crc kubenswrapper[4738]: I0307 07:18:06.400374 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b20b3db-801f-4ee6-8026-c6f32666a798" path="/var/lib/kubelet/pods/5b20b3db-801f-4ee6-8026-c6f32666a798/volumes" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.697966 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-db-create-58f9g"] Mar 07 07:18:12 crc kubenswrapper[4738]: E0307 07:18:12.698628 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="64e3d0e4-8776-4fb3-9a4b-4205a9e755c1" containerName="oc" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.698640 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="64e3d0e4-8776-4fb3-9a4b-4205a9e755c1" containerName="oc" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.698767 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="64e3d0e4-8776-4fb3-9a4b-4205a9e755c1" containerName="oc" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.699202 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-58f9g" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.707586 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx"] Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.709120 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.717347 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-create-58f9g"] Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.717651 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-db-secret" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.739522 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af3e20dd-feb2-4cd0-8745-4a6662a79f86-operator-scripts\") pod \"barbican-db-create-58f9g\" (UID: \"af3e20dd-feb2-4cd0-8745-4a6662a79f86\") " pod="swift-kuttl-tests/barbican-db-create-58f9g" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.739555 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twgp5\" (UniqueName: \"kubernetes.io/projected/7b5da009-da83-4df7-adfb-5bac20ddac17-kube-api-access-twgp5\") pod \"barbican-0d0a-account-create-update-btdzx\" (UID: \"7b5da009-da83-4df7-adfb-5bac20ddac17\") " pod="swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.739610 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm6d8\" (UniqueName: \"kubernetes.io/projected/af3e20dd-feb2-4cd0-8745-4a6662a79f86-kube-api-access-gm6d8\") pod \"barbican-db-create-58f9g\" (UID: 
\"af3e20dd-feb2-4cd0-8745-4a6662a79f86\") " pod="swift-kuttl-tests/barbican-db-create-58f9g" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.739643 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b5da009-da83-4df7-adfb-5bac20ddac17-operator-scripts\") pod \"barbican-0d0a-account-create-update-btdzx\" (UID: \"7b5da009-da83-4df7-adfb-5bac20ddac17\") " pod="swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.752805 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx"] Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.840772 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm6d8\" (UniqueName: \"kubernetes.io/projected/af3e20dd-feb2-4cd0-8745-4a6662a79f86-kube-api-access-gm6d8\") pod \"barbican-db-create-58f9g\" (UID: \"af3e20dd-feb2-4cd0-8745-4a6662a79f86\") " pod="swift-kuttl-tests/barbican-db-create-58f9g" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.840857 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b5da009-da83-4df7-adfb-5bac20ddac17-operator-scripts\") pod \"barbican-0d0a-account-create-update-btdzx\" (UID: \"7b5da009-da83-4df7-adfb-5bac20ddac17\") " pod="swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.840932 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af3e20dd-feb2-4cd0-8745-4a6662a79f86-operator-scripts\") pod \"barbican-db-create-58f9g\" (UID: \"af3e20dd-feb2-4cd0-8745-4a6662a79f86\") " pod="swift-kuttl-tests/barbican-db-create-58f9g" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 
07:18:12.840954 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twgp5\" (UniqueName: \"kubernetes.io/projected/7b5da009-da83-4df7-adfb-5bac20ddac17-kube-api-access-twgp5\") pod \"barbican-0d0a-account-create-update-btdzx\" (UID: \"7b5da009-da83-4df7-adfb-5bac20ddac17\") " pod="swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.842040 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b5da009-da83-4df7-adfb-5bac20ddac17-operator-scripts\") pod \"barbican-0d0a-account-create-update-btdzx\" (UID: \"7b5da009-da83-4df7-adfb-5bac20ddac17\") " pod="swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.842301 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af3e20dd-feb2-4cd0-8745-4a6662a79f86-operator-scripts\") pod \"barbican-db-create-58f9g\" (UID: \"af3e20dd-feb2-4cd0-8745-4a6662a79f86\") " pod="swift-kuttl-tests/barbican-db-create-58f9g" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.861127 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twgp5\" (UniqueName: \"kubernetes.io/projected/7b5da009-da83-4df7-adfb-5bac20ddac17-kube-api-access-twgp5\") pod \"barbican-0d0a-account-create-update-btdzx\" (UID: \"7b5da009-da83-4df7-adfb-5bac20ddac17\") " pod="swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx" Mar 07 07:18:12 crc kubenswrapper[4738]: I0307 07:18:12.866703 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm6d8\" (UniqueName: \"kubernetes.io/projected/af3e20dd-feb2-4cd0-8745-4a6662a79f86-kube-api-access-gm6d8\") pod \"barbican-db-create-58f9g\" (UID: \"af3e20dd-feb2-4cd0-8745-4a6662a79f86\") " 
pod="swift-kuttl-tests/barbican-db-create-58f9g" Mar 07 07:18:13 crc kubenswrapper[4738]: I0307 07:18:13.014981 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-58f9g" Mar 07 07:18:13 crc kubenswrapper[4738]: I0307 07:18:13.024166 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx" Mar 07 07:18:13 crc kubenswrapper[4738]: I0307 07:18:13.382555 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx"] Mar 07 07:18:13 crc kubenswrapper[4738]: I0307 07:18:13.560744 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-create-58f9g"] Mar 07 07:18:13 crc kubenswrapper[4738]: W0307 07:18:13.562259 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf3e20dd_feb2_4cd0_8745_4a6662a79f86.slice/crio-d3a34d53d0e98e3b8e40a8f1adb003b89f614eb04bac1f14e2ed3fee296b11ca WatchSource:0}: Error finding container d3a34d53d0e98e3b8e40a8f1adb003b89f614eb04bac1f14e2ed3fee296b11ca: Status 404 returned error can't find the container with id d3a34d53d0e98e3b8e40a8f1adb003b89f614eb04bac1f14e2ed3fee296b11ca Mar 07 07:18:13 crc kubenswrapper[4738]: I0307 07:18:13.831989 4738 generic.go:334] "Generic (PLEG): container finished" podID="7b5da009-da83-4df7-adfb-5bac20ddac17" containerID="f07d0ca50a2495e02b8e0c7621e1c001bb46286e1026f9643f6ae7d1a829a56d" exitCode=0 Mar 07 07:18:13 crc kubenswrapper[4738]: I0307 07:18:13.832256 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx" event={"ID":"7b5da009-da83-4df7-adfb-5bac20ddac17","Type":"ContainerDied","Data":"f07d0ca50a2495e02b8e0c7621e1c001bb46286e1026f9643f6ae7d1a829a56d"} Mar 07 07:18:13 crc kubenswrapper[4738]: I0307 07:18:13.832340 4738 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx" event={"ID":"7b5da009-da83-4df7-adfb-5bac20ddac17","Type":"ContainerStarted","Data":"f6a39c2cd2f716f07b3c61e7d30bcbc6fcf6388629106544e7cbd94b07ab143d"} Mar 07 07:18:13 crc kubenswrapper[4738]: I0307 07:18:13.835142 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-create-58f9g" event={"ID":"af3e20dd-feb2-4cd0-8745-4a6662a79f86","Type":"ContainerStarted","Data":"58dd40f7c880c5a7414a3dbfd169f1ac50f2fcbfac8ab42ffdc5abc57d1974e5"} Mar 07 07:18:13 crc kubenswrapper[4738]: I0307 07:18:13.835202 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-create-58f9g" event={"ID":"af3e20dd-feb2-4cd0-8745-4a6662a79f86","Type":"ContainerStarted","Data":"d3a34d53d0e98e3b8e40a8f1adb003b89f614eb04bac1f14e2ed3fee296b11ca"} Mar 07 07:18:13 crc kubenswrapper[4738]: I0307 07:18:13.869505 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-db-create-58f9g" podStartSLOduration=1.869481495 podStartE2EDuration="1.869481495s" podCreationTimestamp="2026-03-07 07:18:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:18:13.86414199 +0000 UTC m=+1112.329129311" watchObservedRunningTime="2026-03-07 07:18:13.869481495 +0000 UTC m=+1112.334468856" Mar 07 07:18:14 crc kubenswrapper[4738]: I0307 07:18:14.847278 4738 generic.go:334] "Generic (PLEG): container finished" podID="af3e20dd-feb2-4cd0-8745-4a6662a79f86" containerID="58dd40f7c880c5a7414a3dbfd169f1ac50f2fcbfac8ab42ffdc5abc57d1974e5" exitCode=0 Mar 07 07:18:14 crc kubenswrapper[4738]: I0307 07:18:14.847336 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-create-58f9g" 
event={"ID":"af3e20dd-feb2-4cd0-8745-4a6662a79f86","Type":"ContainerDied","Data":"58dd40f7c880c5a7414a3dbfd169f1ac50f2fcbfac8ab42ffdc5abc57d1974e5"} Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.223868 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx" Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.282761 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b5da009-da83-4df7-adfb-5bac20ddac17-operator-scripts\") pod \"7b5da009-da83-4df7-adfb-5bac20ddac17\" (UID: \"7b5da009-da83-4df7-adfb-5bac20ddac17\") " Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.282836 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twgp5\" (UniqueName: \"kubernetes.io/projected/7b5da009-da83-4df7-adfb-5bac20ddac17-kube-api-access-twgp5\") pod \"7b5da009-da83-4df7-adfb-5bac20ddac17\" (UID: \"7b5da009-da83-4df7-adfb-5bac20ddac17\") " Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.284549 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b5da009-da83-4df7-adfb-5bac20ddac17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b5da009-da83-4df7-adfb-5bac20ddac17" (UID: "7b5da009-da83-4df7-adfb-5bac20ddac17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.290403 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b5da009-da83-4df7-adfb-5bac20ddac17-kube-api-access-twgp5" (OuterVolumeSpecName: "kube-api-access-twgp5") pod "7b5da009-da83-4df7-adfb-5bac20ddac17" (UID: "7b5da009-da83-4df7-adfb-5bac20ddac17"). InnerVolumeSpecName "kube-api-access-twgp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.385011 4738 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b5da009-da83-4df7-adfb-5bac20ddac17-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.385045 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twgp5\" (UniqueName: \"kubernetes.io/projected/7b5da009-da83-4df7-adfb-5bac20ddac17-kube-api-access-twgp5\") on node \"crc\" DevicePath \"\"" Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.510895 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-kq2lh"] Mar 07 07:18:15 crc kubenswrapper[4738]: E0307 07:18:15.511346 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b5da009-da83-4df7-adfb-5bac20ddac17" containerName="mariadb-account-create-update" Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.511371 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b5da009-da83-4df7-adfb-5bac20ddac17" containerName="mariadb-account-create-update" Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.511642 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b5da009-da83-4df7-adfb-5bac20ddac17" containerName="mariadb-account-create-update" Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.515245 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-kq2lh" Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.519044 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-tqqb6" Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.535934 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-kq2lh"] Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.588583 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnq56\" (UniqueName: \"kubernetes.io/projected/5a80278c-d06b-48e6-8ed1-129980ae7ba3-kube-api-access-jnq56\") pod \"swift-operator-index-kq2lh\" (UID: \"5a80278c-d06b-48e6-8ed1-129980ae7ba3\") " pod="openstack-operators/swift-operator-index-kq2lh" Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.690118 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnq56\" (UniqueName: \"kubernetes.io/projected/5a80278c-d06b-48e6-8ed1-129980ae7ba3-kube-api-access-jnq56\") pod \"swift-operator-index-kq2lh\" (UID: \"5a80278c-d06b-48e6-8ed1-129980ae7ba3\") " pod="openstack-operators/swift-operator-index-kq2lh" Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.719700 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnq56\" (UniqueName: \"kubernetes.io/projected/5a80278c-d06b-48e6-8ed1-129980ae7ba3-kube-api-access-jnq56\") pod \"swift-operator-index-kq2lh\" (UID: \"5a80278c-d06b-48e6-8ed1-129980ae7ba3\") " pod="openstack-operators/swift-operator-index-kq2lh" Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.842196 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-kq2lh" Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.858064 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx" event={"ID":"7b5da009-da83-4df7-adfb-5bac20ddac17","Type":"ContainerDied","Data":"f6a39c2cd2f716f07b3c61e7d30bcbc6fcf6388629106544e7cbd94b07ab143d"} Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.858135 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6a39c2cd2f716f07b3c61e7d30bcbc6fcf6388629106544e7cbd94b07ab143d" Mar 07 07:18:15 crc kubenswrapper[4738]: I0307 07:18:15.858089 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx" Mar 07 07:18:16 crc kubenswrapper[4738]: I0307 07:18:16.298771 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-58f9g" Mar 07 07:18:16 crc kubenswrapper[4738]: I0307 07:18:16.400569 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af3e20dd-feb2-4cd0-8745-4a6662a79f86-operator-scripts\") pod \"af3e20dd-feb2-4cd0-8745-4a6662a79f86\" (UID: \"af3e20dd-feb2-4cd0-8745-4a6662a79f86\") " Mar 07 07:18:16 crc kubenswrapper[4738]: I0307 07:18:16.400662 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm6d8\" (UniqueName: \"kubernetes.io/projected/af3e20dd-feb2-4cd0-8745-4a6662a79f86-kube-api-access-gm6d8\") pod \"af3e20dd-feb2-4cd0-8745-4a6662a79f86\" (UID: \"af3e20dd-feb2-4cd0-8745-4a6662a79f86\") " Mar 07 07:18:16 crc kubenswrapper[4738]: I0307 07:18:16.401369 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af3e20dd-feb2-4cd0-8745-4a6662a79f86-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "af3e20dd-feb2-4cd0-8745-4a6662a79f86" (UID: "af3e20dd-feb2-4cd0-8745-4a6662a79f86"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:18:16 crc kubenswrapper[4738]: I0307 07:18:16.406952 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af3e20dd-feb2-4cd0-8745-4a6662a79f86-kube-api-access-gm6d8" (OuterVolumeSpecName: "kube-api-access-gm6d8") pod "af3e20dd-feb2-4cd0-8745-4a6662a79f86" (UID: "af3e20dd-feb2-4cd0-8745-4a6662a79f86"). InnerVolumeSpecName "kube-api-access-gm6d8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:18:16 crc kubenswrapper[4738]: I0307 07:18:16.492990 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-kq2lh"] Mar 07 07:18:16 crc kubenswrapper[4738]: I0307 07:18:16.503761 4738 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/af3e20dd-feb2-4cd0-8745-4a6662a79f86-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:18:16 crc kubenswrapper[4738]: I0307 07:18:16.503807 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm6d8\" (UniqueName: \"kubernetes.io/projected/af3e20dd-feb2-4cd0-8745-4a6662a79f86-kube-api-access-gm6d8\") on node \"crc\" DevicePath \"\"" Mar 07 07:18:16 crc kubenswrapper[4738]: I0307 07:18:16.868350 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-create-58f9g" event={"ID":"af3e20dd-feb2-4cd0-8745-4a6662a79f86","Type":"ContainerDied","Data":"d3a34d53d0e98e3b8e40a8f1adb003b89f614eb04bac1f14e2ed3fee296b11ca"} Mar 07 07:18:16 crc kubenswrapper[4738]: I0307 07:18:16.868397 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3a34d53d0e98e3b8e40a8f1adb003b89f614eb04bac1f14e2ed3fee296b11ca" Mar 07 07:18:16 crc kubenswrapper[4738]: I0307 
07:18:16.868395 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-db-create-58f9g" Mar 07 07:18:16 crc kubenswrapper[4738]: I0307 07:18:16.870184 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-kq2lh" event={"ID":"5a80278c-d06b-48e6-8ed1-129980ae7ba3","Type":"ContainerStarted","Data":"4df54fae12a940a5604c65ec3a733542e1cfb0a34bf833a8498ff73e11167cef"} Mar 07 07:18:18 crc kubenswrapper[4738]: I0307 07:18:18.088335 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-db-sync-9lcxq"] Mar 07 07:18:18 crc kubenswrapper[4738]: E0307 07:18:18.089519 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af3e20dd-feb2-4cd0-8745-4a6662a79f86" containerName="mariadb-database-create" Mar 07 07:18:18 crc kubenswrapper[4738]: I0307 07:18:18.089596 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="af3e20dd-feb2-4cd0-8745-4a6662a79f86" containerName="mariadb-database-create" Mar 07 07:18:18 crc kubenswrapper[4738]: I0307 07:18:18.089767 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="af3e20dd-feb2-4cd0-8745-4a6662a79f86" containerName="mariadb-database-create" Mar 07 07:18:18 crc kubenswrapper[4738]: I0307 07:18:18.090347 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-9lcxq" Mar 07 07:18:18 crc kubenswrapper[4738]: I0307 07:18:18.093064 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-config-data" Mar 07 07:18:18 crc kubenswrapper[4738]: I0307 07:18:18.094543 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-barbican-dockercfg-4f75z" Mar 07 07:18:18 crc kubenswrapper[4738]: I0307 07:18:18.102490 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-sync-9lcxq"] Mar 07 07:18:18 crc kubenswrapper[4738]: I0307 07:18:18.127541 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3032dbb3-8ffe-4712-a9b5-a57f22eebe73-db-sync-config-data\") pod \"barbican-db-sync-9lcxq\" (UID: \"3032dbb3-8ffe-4712-a9b5-a57f22eebe73\") " pod="swift-kuttl-tests/barbican-db-sync-9lcxq" Mar 07 07:18:18 crc kubenswrapper[4738]: I0307 07:18:18.127624 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sj9bf\" (UniqueName: \"kubernetes.io/projected/3032dbb3-8ffe-4712-a9b5-a57f22eebe73-kube-api-access-sj9bf\") pod \"barbican-db-sync-9lcxq\" (UID: \"3032dbb3-8ffe-4712-a9b5-a57f22eebe73\") " pod="swift-kuttl-tests/barbican-db-sync-9lcxq" Mar 07 07:18:18 crc kubenswrapper[4738]: I0307 07:18:18.229218 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3032dbb3-8ffe-4712-a9b5-a57f22eebe73-db-sync-config-data\") pod \"barbican-db-sync-9lcxq\" (UID: \"3032dbb3-8ffe-4712-a9b5-a57f22eebe73\") " pod="swift-kuttl-tests/barbican-db-sync-9lcxq" Mar 07 07:18:18 crc kubenswrapper[4738]: I0307 07:18:18.229300 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sj9bf\" 
(UniqueName: \"kubernetes.io/projected/3032dbb3-8ffe-4712-a9b5-a57f22eebe73-kube-api-access-sj9bf\") pod \"barbican-db-sync-9lcxq\" (UID: \"3032dbb3-8ffe-4712-a9b5-a57f22eebe73\") " pod="swift-kuttl-tests/barbican-db-sync-9lcxq" Mar 07 07:18:18 crc kubenswrapper[4738]: I0307 07:18:18.251927 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3032dbb3-8ffe-4712-a9b5-a57f22eebe73-db-sync-config-data\") pod \"barbican-db-sync-9lcxq\" (UID: \"3032dbb3-8ffe-4712-a9b5-a57f22eebe73\") " pod="swift-kuttl-tests/barbican-db-sync-9lcxq" Mar 07 07:18:18 crc kubenswrapper[4738]: I0307 07:18:18.256018 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sj9bf\" (UniqueName: \"kubernetes.io/projected/3032dbb3-8ffe-4712-a9b5-a57f22eebe73-kube-api-access-sj9bf\") pod \"barbican-db-sync-9lcxq\" (UID: \"3032dbb3-8ffe-4712-a9b5-a57f22eebe73\") " pod="swift-kuttl-tests/barbican-db-sync-9lcxq" Mar 07 07:18:18 crc kubenswrapper[4738]: I0307 07:18:18.411522 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-9lcxq" Mar 07 07:18:19 crc kubenswrapper[4738]: I0307 07:18:19.322365 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/keystone-686f9bb899-4llf4" Mar 07 07:18:19 crc kubenswrapper[4738]: I0307 07:18:19.619458 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-db-sync-9lcxq"] Mar 07 07:18:19 crc kubenswrapper[4738]: W0307 07:18:19.624032 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3032dbb3_8ffe_4712_a9b5_a57f22eebe73.slice/crio-43d380a477386db6a45dca4ffe00574e7db403fcc293f564113b410aaa791308 WatchSource:0}: Error finding container 43d380a477386db6a45dca4ffe00574e7db403fcc293f564113b410aaa791308: Status 404 returned error can't find the container with id 43d380a477386db6a45dca4ffe00574e7db403fcc293f564113b410aaa791308 Mar 07 07:18:19 crc kubenswrapper[4738]: I0307 07:18:19.899981 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-9lcxq" event={"ID":"3032dbb3-8ffe-4712-a9b5-a57f22eebe73","Type":"ContainerStarted","Data":"43d380a477386db6a45dca4ffe00574e7db403fcc293f564113b410aaa791308"} Mar 07 07:18:19 crc kubenswrapper[4738]: I0307 07:18:19.902499 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-kq2lh" event={"ID":"5a80278c-d06b-48e6-8ed1-129980ae7ba3","Type":"ContainerStarted","Data":"4b0a079e7aa1182e702ef989c67a69dc9fb994ce5e755803fe4525383994600d"} Mar 07 07:18:19 crc kubenswrapper[4738]: I0307 07:18:19.924348 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-kq2lh" podStartSLOduration=2.206015864 podStartE2EDuration="4.924317592s" podCreationTimestamp="2026-03-07 07:18:15 +0000 UTC" firstStartedPulling="2026-03-07 07:18:16.505291672 +0000 UTC m=+1114.970279013" 
lastFinishedPulling="2026-03-07 07:18:19.22359342 +0000 UTC m=+1117.688580741" observedRunningTime="2026-03-07 07:18:19.919491092 +0000 UTC m=+1118.384478443" watchObservedRunningTime="2026-03-07 07:18:19.924317592 +0000 UTC m=+1118.389304943" Mar 07 07:18:24 crc kubenswrapper[4738]: I0307 07:18:24.962100 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-9lcxq" event={"ID":"3032dbb3-8ffe-4712-a9b5-a57f22eebe73","Type":"ContainerStarted","Data":"01475dcbebb00264197d30a008c8815477aff1de7227fadf67e28553e5750d55"} Mar 07 07:18:25 crc kubenswrapper[4738]: I0307 07:18:25.842581 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/swift-operator-index-kq2lh" Mar 07 07:18:25 crc kubenswrapper[4738]: I0307 07:18:25.842664 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-index-kq2lh" Mar 07 07:18:25 crc kubenswrapper[4738]: I0307 07:18:25.887560 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/swift-operator-index-kq2lh" Mar 07 07:18:25 crc kubenswrapper[4738]: I0307 07:18:25.910631 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-db-sync-9lcxq" podStartSLOduration=3.695209413 podStartE2EDuration="7.910604105s" podCreationTimestamp="2026-03-07 07:18:18 +0000 UTC" firstStartedPulling="2026-03-07 07:18:19.627802438 +0000 UTC m=+1118.092789759" lastFinishedPulling="2026-03-07 07:18:23.84319713 +0000 UTC m=+1122.308184451" observedRunningTime="2026-03-07 07:18:24.994593167 +0000 UTC m=+1123.459580498" watchObservedRunningTime="2026-03-07 07:18:25.910604105 +0000 UTC m=+1124.375591476" Mar 07 07:18:26 crc kubenswrapper[4738]: I0307 07:18:26.014116 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-kq2lh" Mar 07 07:18:26 crc kubenswrapper[4738]: E0307 
07:18:26.426496 4738 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3032dbb3_8ffe_4712_a9b5_a57f22eebe73.slice/crio-conmon-01475dcbebb00264197d30a008c8815477aff1de7227fadf67e28553e5750d55.scope\": RecentStats: unable to find data in memory cache]" Mar 07 07:18:26 crc kubenswrapper[4738]: I0307 07:18:26.958122 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:18:26 crc kubenswrapper[4738]: I0307 07:18:26.958217 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:18:26 crc kubenswrapper[4738]: I0307 07:18:26.981996 4738 generic.go:334] "Generic (PLEG): container finished" podID="3032dbb3-8ffe-4712-a9b5-a57f22eebe73" containerID="01475dcbebb00264197d30a008c8815477aff1de7227fadf67e28553e5750d55" exitCode=0 Mar 07 07:18:26 crc kubenswrapper[4738]: I0307 07:18:26.982113 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-9lcxq" event={"ID":"3032dbb3-8ffe-4712-a9b5-a57f22eebe73","Type":"ContainerDied","Data":"01475dcbebb00264197d30a008c8815477aff1de7227fadf67e28553e5750d55"} Mar 07 07:18:28 crc kubenswrapper[4738]: I0307 07:18:28.327043 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-9lcxq" Mar 07 07:18:28 crc kubenswrapper[4738]: I0307 07:18:28.401393 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sj9bf\" (UniqueName: \"kubernetes.io/projected/3032dbb3-8ffe-4712-a9b5-a57f22eebe73-kube-api-access-sj9bf\") pod \"3032dbb3-8ffe-4712-a9b5-a57f22eebe73\" (UID: \"3032dbb3-8ffe-4712-a9b5-a57f22eebe73\") " Mar 07 07:18:28 crc kubenswrapper[4738]: I0307 07:18:28.401484 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3032dbb3-8ffe-4712-a9b5-a57f22eebe73-db-sync-config-data\") pod \"3032dbb3-8ffe-4712-a9b5-a57f22eebe73\" (UID: \"3032dbb3-8ffe-4712-a9b5-a57f22eebe73\") " Mar 07 07:18:28 crc kubenswrapper[4738]: I0307 07:18:28.407712 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3032dbb3-8ffe-4712-a9b5-a57f22eebe73-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3032dbb3-8ffe-4712-a9b5-a57f22eebe73" (UID: "3032dbb3-8ffe-4712-a9b5-a57f22eebe73"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:18:28 crc kubenswrapper[4738]: I0307 07:18:28.408329 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3032dbb3-8ffe-4712-a9b5-a57f22eebe73-kube-api-access-sj9bf" (OuterVolumeSpecName: "kube-api-access-sj9bf") pod "3032dbb3-8ffe-4712-a9b5-a57f22eebe73" (UID: "3032dbb3-8ffe-4712-a9b5-a57f22eebe73"). InnerVolumeSpecName "kube-api-access-sj9bf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:18:28 crc kubenswrapper[4738]: I0307 07:18:28.503934 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sj9bf\" (UniqueName: \"kubernetes.io/projected/3032dbb3-8ffe-4712-a9b5-a57f22eebe73-kube-api-access-sj9bf\") on node \"crc\" DevicePath \"\"" Mar 07 07:18:28 crc kubenswrapper[4738]: I0307 07:18:28.504686 4738 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3032dbb3-8ffe-4712-a9b5-a57f22eebe73-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.002929 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-db-sync-9lcxq" event={"ID":"3032dbb3-8ffe-4712-a9b5-a57f22eebe73","Type":"ContainerDied","Data":"43d380a477386db6a45dca4ffe00574e7db403fcc293f564113b410aaa791308"} Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.002995 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43d380a477386db6a45dca4ffe00574e7db403fcc293f564113b410aaa791308" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.003043 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-db-sync-9lcxq" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.057697 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc"] Mar 07 07:18:29 crc kubenswrapper[4738]: E0307 07:18:29.058717 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3032dbb3-8ffe-4712-a9b5-a57f22eebe73" containerName="barbican-db-sync" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.058768 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="3032dbb3-8ffe-4712-a9b5-a57f22eebe73" containerName="barbican-db-sync" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.059196 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="3032dbb3-8ffe-4712-a9b5-a57f22eebe73" containerName="barbican-db-sync" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.062045 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.066689 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-fvfwj" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.076522 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc"] Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.115116 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfqjx\" (UniqueName: \"kubernetes.io/projected/c2b59545-69a9-477d-857c-2134d53edc2d-kube-api-access-mfqjx\") pod \"7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc\" (UID: \"c2b59545-69a9-477d-857c-2134d53edc2d\") " pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" 
Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.115536 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2b59545-69a9-477d-857c-2134d53edc2d-bundle\") pod \"7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc\" (UID: \"c2b59545-69a9-477d-857c-2134d53edc2d\") " pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.115627 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2b59545-69a9-477d-857c-2134d53edc2d-util\") pod \"7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc\" (UID: \"c2b59545-69a9-477d-857c-2134d53edc2d\") " pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.217088 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2b59545-69a9-477d-857c-2134d53edc2d-bundle\") pod \"7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc\" (UID: \"c2b59545-69a9-477d-857c-2134d53edc2d\") " pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.217194 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2b59545-69a9-477d-857c-2134d53edc2d-util\") pod \"7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc\" (UID: \"c2b59545-69a9-477d-857c-2134d53edc2d\") " pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.217270 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mfqjx\" (UniqueName: \"kubernetes.io/projected/c2b59545-69a9-477d-857c-2134d53edc2d-kube-api-access-mfqjx\") pod \"7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc\" (UID: \"c2b59545-69a9-477d-857c-2134d53edc2d\") " pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.217832 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2b59545-69a9-477d-857c-2134d53edc2d-bundle\") pod \"7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc\" (UID: \"c2b59545-69a9-477d-857c-2134d53edc2d\") " pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.218578 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2b59545-69a9-477d-857c-2134d53edc2d-util\") pod \"7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc\" (UID: \"c2b59545-69a9-477d-857c-2134d53edc2d\") " pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.246922 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfqjx\" (UniqueName: \"kubernetes.io/projected/c2b59545-69a9-477d-857c-2134d53edc2d-kube-api-access-mfqjx\") pod \"7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc\" (UID: \"c2b59545-69a9-477d-857c-2134d53edc2d\") " pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.395447 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.699502 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj"] Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.701774 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.709566 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-worker-config-data" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.709736 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-config-data" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.709828 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-barbican-dockercfg-4f75z" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.725778 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj"] Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.726697 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31dd356b-08d4-49ca-81c6-37f7f89cc581-logs\") pod \"barbican-worker-76fbcdbdf7-hhpvj\" (UID: \"31dd356b-08d4-49ca-81c6-37f7f89cc581\") " pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.726776 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31dd356b-08d4-49ca-81c6-37f7f89cc581-config-data-custom\") pod \"barbican-worker-76fbcdbdf7-hhpvj\" (UID: \"31dd356b-08d4-49ca-81c6-37f7f89cc581\") " 
pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.726814 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdcnz\" (UniqueName: \"kubernetes.io/projected/31dd356b-08d4-49ca-81c6-37f7f89cc581-kube-api-access-rdcnz\") pod \"barbican-worker-76fbcdbdf7-hhpvj\" (UID: \"31dd356b-08d4-49ca-81c6-37f7f89cc581\") " pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.726857 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31dd356b-08d4-49ca-81c6-37f7f89cc581-config-data\") pod \"barbican-worker-76fbcdbdf7-hhpvj\" (UID: \"31dd356b-08d4-49ca-81c6-37f7f89cc581\") " pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.740540 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl"] Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.741782 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.746565 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-api-config-data" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.746848 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6"] Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.747947 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.749884 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"barbican-keystone-listener-config-data" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.765640 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl"] Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.774968 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6"] Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.829118 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31dd356b-08d4-49ca-81c6-37f7f89cc581-logs\") pod \"barbican-worker-76fbcdbdf7-hhpvj\" (UID: \"31dd356b-08d4-49ca-81c6-37f7f89cc581\") " pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.829216 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5qkk\" (UniqueName: \"kubernetes.io/projected/85f7ad24-4e32-4833-b113-2f53c5fe23c2-kube-api-access-l5qkk\") pod \"barbican-api-57ff6dd9fd-sd8jl\" (UID: \"85f7ad24-4e32-4833-b113-2f53c5fe23c2\") " pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.829286 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85f7ad24-4e32-4833-b113-2f53c5fe23c2-config-data-custom\") pod \"barbican-api-57ff6dd9fd-sd8jl\" (UID: \"85f7ad24-4e32-4833-b113-2f53c5fe23c2\") " pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.829316 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f7ad24-4e32-4833-b113-2f53c5fe23c2-logs\") pod \"barbican-api-57ff6dd9fd-sd8jl\" (UID: \"85f7ad24-4e32-4833-b113-2f53c5fe23c2\") " pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.829383 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31dd356b-08d4-49ca-81c6-37f7f89cc581-config-data-custom\") pod \"barbican-worker-76fbcdbdf7-hhpvj\" (UID: \"31dd356b-08d4-49ca-81c6-37f7f89cc581\") " pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.829453 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdcnz\" (UniqueName: \"kubernetes.io/projected/31dd356b-08d4-49ca-81c6-37f7f89cc581-kube-api-access-rdcnz\") pod \"barbican-worker-76fbcdbdf7-hhpvj\" (UID: \"31dd356b-08d4-49ca-81c6-37f7f89cc581\") " pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.829511 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f7ad24-4e32-4833-b113-2f53c5fe23c2-config-data\") pod \"barbican-api-57ff6dd9fd-sd8jl\" (UID: \"85f7ad24-4e32-4833-b113-2f53c5fe23c2\") " pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.829544 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e-config-data\") pod \"barbican-keystone-listener-7cddffb874-sgxs6\" (UID: \"ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e\") " pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" Mar 07 07:18:29 crc 
kubenswrapper[4738]: I0307 07:18:29.829569 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e-logs\") pod \"barbican-keystone-listener-7cddffb874-sgxs6\" (UID: \"ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e\") " pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.829612 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9qzd\" (UniqueName: \"kubernetes.io/projected/ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e-kube-api-access-t9qzd\") pod \"barbican-keystone-listener-7cddffb874-sgxs6\" (UID: \"ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e\") " pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.829645 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31dd356b-08d4-49ca-81c6-37f7f89cc581-config-data\") pod \"barbican-worker-76fbcdbdf7-hhpvj\" (UID: \"31dd356b-08d4-49ca-81c6-37f7f89cc581\") " pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.829689 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e-config-data-custom\") pod \"barbican-keystone-listener-7cddffb874-sgxs6\" (UID: \"ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e\") " pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.829791 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/31dd356b-08d4-49ca-81c6-37f7f89cc581-logs\") pod \"barbican-worker-76fbcdbdf7-hhpvj\" (UID: 
\"31dd356b-08d4-49ca-81c6-37f7f89cc581\") " pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.837003 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31dd356b-08d4-49ca-81c6-37f7f89cc581-config-data\") pod \"barbican-worker-76fbcdbdf7-hhpvj\" (UID: \"31dd356b-08d4-49ca-81c6-37f7f89cc581\") " pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.844266 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdcnz\" (UniqueName: \"kubernetes.io/projected/31dd356b-08d4-49ca-81c6-37f7f89cc581-kube-api-access-rdcnz\") pod \"barbican-worker-76fbcdbdf7-hhpvj\" (UID: \"31dd356b-08d4-49ca-81c6-37f7f89cc581\") " pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.844775 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/31dd356b-08d4-49ca-81c6-37f7f89cc581-config-data-custom\") pod \"barbican-worker-76fbcdbdf7-hhpvj\" (UID: \"31dd356b-08d4-49ca-81c6-37f7f89cc581\") " pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.894688 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc"] Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.931236 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5qkk\" (UniqueName: \"kubernetes.io/projected/85f7ad24-4e32-4833-b113-2f53c5fe23c2-kube-api-access-l5qkk\") pod \"barbican-api-57ff6dd9fd-sd8jl\" (UID: \"85f7ad24-4e32-4833-b113-2f53c5fe23c2\") " pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.931291 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85f7ad24-4e32-4833-b113-2f53c5fe23c2-config-data-custom\") pod \"barbican-api-57ff6dd9fd-sd8jl\" (UID: \"85f7ad24-4e32-4833-b113-2f53c5fe23c2\") " pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.931311 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f7ad24-4e32-4833-b113-2f53c5fe23c2-logs\") pod \"barbican-api-57ff6dd9fd-sd8jl\" (UID: \"85f7ad24-4e32-4833-b113-2f53c5fe23c2\") " pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.931357 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f7ad24-4e32-4833-b113-2f53c5fe23c2-config-data\") pod \"barbican-api-57ff6dd9fd-sd8jl\" (UID: \"85f7ad24-4e32-4833-b113-2f53c5fe23c2\") " pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.931380 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e-config-data\") pod \"barbican-keystone-listener-7cddffb874-sgxs6\" (UID: \"ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e\") " pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.931396 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e-logs\") pod \"barbican-keystone-listener-7cddffb874-sgxs6\" (UID: \"ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e\") " pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.931411 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9qzd\" (UniqueName: \"kubernetes.io/projected/ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e-kube-api-access-t9qzd\") pod \"barbican-keystone-listener-7cddffb874-sgxs6\" (UID: \"ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e\") " pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.931466 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e-config-data-custom\") pod \"barbican-keystone-listener-7cddffb874-sgxs6\" (UID: \"ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e\") " pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.931761 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e-logs\") pod \"barbican-keystone-listener-7cddffb874-sgxs6\" (UID: \"ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e\") " pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.932086 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/85f7ad24-4e32-4833-b113-2f53c5fe23c2-logs\") pod \"barbican-api-57ff6dd9fd-sd8jl\" (UID: \"85f7ad24-4e32-4833-b113-2f53c5fe23c2\") " pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.934800 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e-config-data-custom\") pod \"barbican-keystone-listener-7cddffb874-sgxs6\" (UID: \"ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e\") " pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" 
Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.935220 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/85f7ad24-4e32-4833-b113-2f53c5fe23c2-config-data-custom\") pod \"barbican-api-57ff6dd9fd-sd8jl\" (UID: \"85f7ad24-4e32-4833-b113-2f53c5fe23c2\") " pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.935769 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e-config-data\") pod \"barbican-keystone-listener-7cddffb874-sgxs6\" (UID: \"ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e\") " pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.937563 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/85f7ad24-4e32-4833-b113-2f53c5fe23c2-config-data\") pod \"barbican-api-57ff6dd9fd-sd8jl\" (UID: \"85f7ad24-4e32-4833-b113-2f53c5fe23c2\") " pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.947027 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5qkk\" (UniqueName: \"kubernetes.io/projected/85f7ad24-4e32-4833-b113-2f53c5fe23c2-kube-api-access-l5qkk\") pod \"barbican-api-57ff6dd9fd-sd8jl\" (UID: \"85f7ad24-4e32-4833-b113-2f53c5fe23c2\") " pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:29 crc kubenswrapper[4738]: I0307 07:18:29.947551 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9qzd\" (UniqueName: \"kubernetes.io/projected/ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e-kube-api-access-t9qzd\") pod \"barbican-keystone-listener-7cddffb874-sgxs6\" (UID: \"ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e\") " 
pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" Mar 07 07:18:30 crc kubenswrapper[4738]: I0307 07:18:30.009754 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" event={"ID":"c2b59545-69a9-477d-857c-2134d53edc2d","Type":"ContainerStarted","Data":"f830a25a8f13394986cc243b2fb50b21d9d8c9572107d7b160f06a9be2028b90"} Mar 07 07:18:30 crc kubenswrapper[4738]: I0307 07:18:30.029782 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" Mar 07 07:18:30 crc kubenswrapper[4738]: I0307 07:18:30.064430 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:30 crc kubenswrapper[4738]: I0307 07:18:30.070351 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" Mar 07 07:18:30 crc kubenswrapper[4738]: I0307 07:18:30.355661 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj"] Mar 07 07:18:30 crc kubenswrapper[4738]: W0307 07:18:30.359861 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31dd356b_08d4_49ca_81c6_37f7f89cc581.slice/crio-ee30b3433efbac3c8fd9dd359e4abfb4cc88ed257940d7d652743c8c3ec31dac WatchSource:0}: Error finding container ee30b3433efbac3c8fd9dd359e4abfb4cc88ed257940d7d652743c8c3ec31dac: Status 404 returned error can't find the container with id ee30b3433efbac3c8fd9dd359e4abfb4cc88ed257940d7d652743c8c3ec31dac Mar 07 07:18:30 crc kubenswrapper[4738]: I0307 07:18:30.393324 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6"] Mar 07 07:18:30 crc kubenswrapper[4738]: W0307 07:18:30.402061 4738 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podee6a6f2f_a684_4ee3_8cb5_9405fa4f034e.slice/crio-fd91e006555ab4d8aa4da8c141d8fceab68b811f2e655355dba5856f8bc7d8d2 WatchSource:0}: Error finding container fd91e006555ab4d8aa4da8c141d8fceab68b811f2e655355dba5856f8bc7d8d2: Status 404 returned error can't find the container with id fd91e006555ab4d8aa4da8c141d8fceab68b811f2e655355dba5856f8bc7d8d2 Mar 07 07:18:30 crc kubenswrapper[4738]: I0307 07:18:30.452400 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl"] Mar 07 07:18:31 crc kubenswrapper[4738]: I0307 07:18:31.017550 4738 generic.go:334] "Generic (PLEG): container finished" podID="c2b59545-69a9-477d-857c-2134d53edc2d" containerID="fe63c33aefac8201fa6d098f019f191b1f5e3d414e33f324b22a9076b4ec7bfa" exitCode=0 Mar 07 07:18:31 crc kubenswrapper[4738]: I0307 07:18:31.017689 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" event={"ID":"c2b59545-69a9-477d-857c-2134d53edc2d","Type":"ContainerDied","Data":"fe63c33aefac8201fa6d098f019f191b1f5e3d414e33f324b22a9076b4ec7bfa"} Mar 07 07:18:31 crc kubenswrapper[4738]: I0307 07:18:31.019571 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" event={"ID":"ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e","Type":"ContainerStarted","Data":"fd91e006555ab4d8aa4da8c141d8fceab68b811f2e655355dba5856f8bc7d8d2"} Mar 07 07:18:31 crc kubenswrapper[4738]: I0307 07:18:31.023284 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" event={"ID":"85f7ad24-4e32-4833-b113-2f53c5fe23c2","Type":"ContainerStarted","Data":"32e645de96262f5c5973e6de3fbd8155c6a5a63e42e88bc2967ac362d9e389cb"} Mar 07 07:18:31 crc kubenswrapper[4738]: I0307 07:18:31.023317 4738 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" event={"ID":"85f7ad24-4e32-4833-b113-2f53c5fe23c2","Type":"ContainerStarted","Data":"5bfdabcc140a655ac47110a937cf939b9ff5dcb3d6995351f645cf6fd8ab0dc7"} Mar 07 07:18:31 crc kubenswrapper[4738]: I0307 07:18:31.028857 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" event={"ID":"31dd356b-08d4-49ca-81c6-37f7f89cc581","Type":"ContainerStarted","Data":"ee30b3433efbac3c8fd9dd359e4abfb4cc88ed257940d7d652743c8c3ec31dac"} Mar 07 07:18:32 crc kubenswrapper[4738]: I0307 07:18:32.073425 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" event={"ID":"85f7ad24-4e32-4833-b113-2f53c5fe23c2","Type":"ContainerStarted","Data":"3f936fd93b8b141b70f118b82e822bf19202de696ba54934db83dc66eb85f24e"} Mar 07 07:18:32 crc kubenswrapper[4738]: I0307 07:18:32.073865 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:32 crc kubenswrapper[4738]: I0307 07:18:32.074080 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:32 crc kubenswrapper[4738]: I0307 07:18:32.115487 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" podStartSLOduration=3.115461043 podStartE2EDuration="3.115461043s" podCreationTimestamp="2026-03-07 07:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:18:32.106179912 +0000 UTC m=+1130.571167253" watchObservedRunningTime="2026-03-07 07:18:32.115461043 +0000 UTC m=+1130.580448374" Mar 07 07:18:33 crc kubenswrapper[4738]: I0307 07:18:33.085736 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" event={"ID":"31dd356b-08d4-49ca-81c6-37f7f89cc581","Type":"ContainerStarted","Data":"b988ad1eff8eb8e2d0cfc6897a2c58b21ef5f7fc3860d3c37f0f072fbf8f9623"} Mar 07 07:18:33 crc kubenswrapper[4738]: I0307 07:18:33.086024 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" event={"ID":"31dd356b-08d4-49ca-81c6-37f7f89cc581","Type":"ContainerStarted","Data":"9511433ea134c77499075ff28492202cdefca4185521925c083f34f27b2ad352"} Mar 07 07:18:33 crc kubenswrapper[4738]: I0307 07:18:33.088197 4738 generic.go:334] "Generic (PLEG): container finished" podID="c2b59545-69a9-477d-857c-2134d53edc2d" containerID="dd0d42affecae1f78f9fed497ea5695f96ac44b115c0cda138beec1842db9b1f" exitCode=0 Mar 07 07:18:33 crc kubenswrapper[4738]: I0307 07:18:33.088251 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" event={"ID":"c2b59545-69a9-477d-857c-2134d53edc2d","Type":"ContainerDied","Data":"dd0d42affecae1f78f9fed497ea5695f96ac44b115c0cda138beec1842db9b1f"} Mar 07 07:18:33 crc kubenswrapper[4738]: I0307 07:18:33.091590 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" event={"ID":"ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e","Type":"ContainerStarted","Data":"b761a6064f570c5172bf4b84b1939d6b8cffab72e867b665807987754dcf8175"} Mar 07 07:18:33 crc kubenswrapper[4738]: I0307 07:18:33.091663 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" event={"ID":"ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e","Type":"ContainerStarted","Data":"5bcb58dbbcf46f732d404338acfd3a68195db814afda23e49ddd98b250ab7073"} Mar 07 07:18:33 crc kubenswrapper[4738]: I0307 07:18:33.113959 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="swift-kuttl-tests/barbican-worker-76fbcdbdf7-hhpvj" podStartSLOduration=2.6705016329999998 podStartE2EDuration="4.113940293s" podCreationTimestamp="2026-03-07 07:18:29 +0000 UTC" firstStartedPulling="2026-03-07 07:18:30.364199443 +0000 UTC m=+1128.829186764" lastFinishedPulling="2026-03-07 07:18:31.807638093 +0000 UTC m=+1130.272625424" observedRunningTime="2026-03-07 07:18:33.108808474 +0000 UTC m=+1131.573795835" watchObservedRunningTime="2026-03-07 07:18:33.113940293 +0000 UTC m=+1131.578927624" Mar 07 07:18:33 crc kubenswrapper[4738]: I0307 07:18:33.159202 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/barbican-keystone-listener-7cddffb874-sgxs6" podStartSLOduration=2.746890789 podStartE2EDuration="4.159177327s" podCreationTimestamp="2026-03-07 07:18:29 +0000 UTC" firstStartedPulling="2026-03-07 07:18:30.408669626 +0000 UTC m=+1128.873656947" lastFinishedPulling="2026-03-07 07:18:31.820956154 +0000 UTC m=+1130.285943485" observedRunningTime="2026-03-07 07:18:33.140222084 +0000 UTC m=+1131.605209445" watchObservedRunningTime="2026-03-07 07:18:33.159177327 +0000 UTC m=+1131.624164648" Mar 07 07:18:34 crc kubenswrapper[4738]: I0307 07:18:34.104123 4738 generic.go:334] "Generic (PLEG): container finished" podID="c2b59545-69a9-477d-857c-2134d53edc2d" containerID="3d34a47f491f14010cc9402d79e4f08093d2234e69bcd407fc38854d3dbda04c" exitCode=0 Mar 07 07:18:34 crc kubenswrapper[4738]: I0307 07:18:34.104212 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" event={"ID":"c2b59545-69a9-477d-857c-2134d53edc2d","Type":"ContainerDied","Data":"3d34a47f491f14010cc9402d79e4f08093d2234e69bcd407fc38854d3dbda04c"} Mar 07 07:18:35 crc kubenswrapper[4738]: I0307 07:18:35.412963 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" Mar 07 07:18:35 crc kubenswrapper[4738]: I0307 07:18:35.523843 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfqjx\" (UniqueName: \"kubernetes.io/projected/c2b59545-69a9-477d-857c-2134d53edc2d-kube-api-access-mfqjx\") pod \"c2b59545-69a9-477d-857c-2134d53edc2d\" (UID: \"c2b59545-69a9-477d-857c-2134d53edc2d\") " Mar 07 07:18:35 crc kubenswrapper[4738]: I0307 07:18:35.523964 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2b59545-69a9-477d-857c-2134d53edc2d-bundle\") pod \"c2b59545-69a9-477d-857c-2134d53edc2d\" (UID: \"c2b59545-69a9-477d-857c-2134d53edc2d\") " Mar 07 07:18:35 crc kubenswrapper[4738]: I0307 07:18:35.524004 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2b59545-69a9-477d-857c-2134d53edc2d-util\") pod \"c2b59545-69a9-477d-857c-2134d53edc2d\" (UID: \"c2b59545-69a9-477d-857c-2134d53edc2d\") " Mar 07 07:18:35 crc kubenswrapper[4738]: I0307 07:18:35.525327 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2b59545-69a9-477d-857c-2134d53edc2d-bundle" (OuterVolumeSpecName: "bundle") pod "c2b59545-69a9-477d-857c-2134d53edc2d" (UID: "c2b59545-69a9-477d-857c-2134d53edc2d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:18:35 crc kubenswrapper[4738]: I0307 07:18:35.532545 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2b59545-69a9-477d-857c-2134d53edc2d-kube-api-access-mfqjx" (OuterVolumeSpecName: "kube-api-access-mfqjx") pod "c2b59545-69a9-477d-857c-2134d53edc2d" (UID: "c2b59545-69a9-477d-857c-2134d53edc2d"). InnerVolumeSpecName "kube-api-access-mfqjx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:18:35 crc kubenswrapper[4738]: I0307 07:18:35.625458 4738 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c2b59545-69a9-477d-857c-2134d53edc2d-bundle\") on node \"crc\" DevicePath \"\"" Mar 07 07:18:35 crc kubenswrapper[4738]: I0307 07:18:35.625776 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfqjx\" (UniqueName: \"kubernetes.io/projected/c2b59545-69a9-477d-857c-2134d53edc2d-kube-api-access-mfqjx\") on node \"crc\" DevicePath \"\"" Mar 07 07:18:35 crc kubenswrapper[4738]: I0307 07:18:35.792246 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2b59545-69a9-477d-857c-2134d53edc2d-util" (OuterVolumeSpecName: "util") pod "c2b59545-69a9-477d-857c-2134d53edc2d" (UID: "c2b59545-69a9-477d-857c-2134d53edc2d"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:18:35 crc kubenswrapper[4738]: I0307 07:18:35.828630 4738 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c2b59545-69a9-477d-857c-2134d53edc2d-util\") on node \"crc\" DevicePath \"\"" Mar 07 07:18:36 crc kubenswrapper[4738]: I0307 07:18:36.125940 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" event={"ID":"c2b59545-69a9-477d-857c-2134d53edc2d","Type":"ContainerDied","Data":"f830a25a8f13394986cc243b2fb50b21d9d8c9572107d7b160f06a9be2028b90"} Mar 07 07:18:36 crc kubenswrapper[4738]: I0307 07:18:36.126268 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc" Mar 07 07:18:36 crc kubenswrapper[4738]: I0307 07:18:36.126268 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f830a25a8f13394986cc243b2fb50b21d9d8c9572107d7b160f06a9be2028b90" Mar 07 07:18:36 crc kubenswrapper[4738]: I0307 07:18:36.467115 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:37 crc kubenswrapper[4738]: I0307 07:18:37.760600 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/barbican-api-57ff6dd9fd-sd8jl" Mar 07 07:18:44 crc kubenswrapper[4738]: I0307 07:18:44.058950 4738 scope.go:117] "RemoveContainer" containerID="f4f63b30ce670e3b9dbd6410ebe2380cbec74d33da37f92238589e03f79eb043" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.400572 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4"] Mar 07 07:18:49 crc kubenswrapper[4738]: E0307 07:18:49.401577 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b59545-69a9-477d-857c-2134d53edc2d" containerName="extract" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.401601 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b59545-69a9-477d-857c-2134d53edc2d" containerName="extract" Mar 07 07:18:49 crc kubenswrapper[4738]: E0307 07:18:49.401625 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2b59545-69a9-477d-857c-2134d53edc2d" containerName="pull" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.401636 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b59545-69a9-477d-857c-2134d53edc2d" containerName="pull" Mar 07 07:18:49 crc kubenswrapper[4738]: E0307 07:18:49.401662 4738 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c2b59545-69a9-477d-857c-2134d53edc2d" containerName="util" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.401672 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2b59545-69a9-477d-857c-2134d53edc2d" containerName="util" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.401900 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2b59545-69a9-477d-857c-2134d53edc2d" containerName="extract" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.402908 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.406083 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.406741 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-gx5ps" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.420933 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4"] Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.539762 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/86f1965b-c0fd-4540-909f-301789d064af-webhook-cert\") pod \"swift-operator-controller-manager-5f949bdfdb-rrqv4\" (UID: \"86f1965b-c0fd-4540-909f-301789d064af\") " pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.539851 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/86f1965b-c0fd-4540-909f-301789d064af-apiservice-cert\") pod 
\"swift-operator-controller-manager-5f949bdfdb-rrqv4\" (UID: \"86f1965b-c0fd-4540-909f-301789d064af\") " pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.539901 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk7hz\" (UniqueName: \"kubernetes.io/projected/86f1965b-c0fd-4540-909f-301789d064af-kube-api-access-dk7hz\") pod \"swift-operator-controller-manager-5f949bdfdb-rrqv4\" (UID: \"86f1965b-c0fd-4540-909f-301789d064af\") " pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.641571 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/86f1965b-c0fd-4540-909f-301789d064af-webhook-cert\") pod \"swift-operator-controller-manager-5f949bdfdb-rrqv4\" (UID: \"86f1965b-c0fd-4540-909f-301789d064af\") " pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.641632 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/86f1965b-c0fd-4540-909f-301789d064af-apiservice-cert\") pod \"swift-operator-controller-manager-5f949bdfdb-rrqv4\" (UID: \"86f1965b-c0fd-4540-909f-301789d064af\") " pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.641678 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk7hz\" (UniqueName: \"kubernetes.io/projected/86f1965b-c0fd-4540-909f-301789d064af-kube-api-access-dk7hz\") pod \"swift-operator-controller-manager-5f949bdfdb-rrqv4\" (UID: \"86f1965b-c0fd-4540-909f-301789d064af\") " pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" Mar 07 
07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.650414 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/86f1965b-c0fd-4540-909f-301789d064af-webhook-cert\") pod \"swift-operator-controller-manager-5f949bdfdb-rrqv4\" (UID: \"86f1965b-c0fd-4540-909f-301789d064af\") " pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.650520 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/86f1965b-c0fd-4540-909f-301789d064af-apiservice-cert\") pod \"swift-operator-controller-manager-5f949bdfdb-rrqv4\" (UID: \"86f1965b-c0fd-4540-909f-301789d064af\") " pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.662506 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk7hz\" (UniqueName: \"kubernetes.io/projected/86f1965b-c0fd-4540-909f-301789d064af-kube-api-access-dk7hz\") pod \"swift-operator-controller-manager-5f949bdfdb-rrqv4\" (UID: \"86f1965b-c0fd-4540-909f-301789d064af\") " pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" Mar 07 07:18:49 crc kubenswrapper[4738]: I0307 07:18:49.724175 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" Mar 07 07:18:50 crc kubenswrapper[4738]: I0307 07:18:50.315137 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4"] Mar 07 07:18:50 crc kubenswrapper[4738]: W0307 07:18:50.331801 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86f1965b_c0fd_4540_909f_301789d064af.slice/crio-6012a5b3dbc4d7f2c88e2c8773c3c22acd4ea081679d5323ceb71db15304f64c WatchSource:0}: Error finding container 6012a5b3dbc4d7f2c88e2c8773c3c22acd4ea081679d5323ceb71db15304f64c: Status 404 returned error can't find the container with id 6012a5b3dbc4d7f2c88e2c8773c3c22acd4ea081679d5323ceb71db15304f64c Mar 07 07:18:51 crc kubenswrapper[4738]: I0307 07:18:51.290107 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" event={"ID":"86f1965b-c0fd-4540-909f-301789d064af","Type":"ContainerStarted","Data":"6012a5b3dbc4d7f2c88e2c8773c3c22acd4ea081679d5323ceb71db15304f64c"} Mar 07 07:18:53 crc kubenswrapper[4738]: I0307 07:18:53.323304 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" event={"ID":"86f1965b-c0fd-4540-909f-301789d064af","Type":"ContainerStarted","Data":"d15ae50992035baae3dbe83c15c2d61715668fd22b12d2de955421c8a7d66608"} Mar 07 07:18:53 crc kubenswrapper[4738]: I0307 07:18:53.323979 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" Mar 07 07:18:53 crc kubenswrapper[4738]: I0307 07:18:53.349049 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" podStartSLOduration=1.562699517 
podStartE2EDuration="4.349014127s" podCreationTimestamp="2026-03-07 07:18:49 +0000 UTC" firstStartedPulling="2026-03-07 07:18:50.336502096 +0000 UTC m=+1148.801489407" lastFinishedPulling="2026-03-07 07:18:53.122816706 +0000 UTC m=+1151.587804017" observedRunningTime="2026-03-07 07:18:53.342515841 +0000 UTC m=+1151.807503202" watchObservedRunningTime="2026-03-07 07:18:53.349014127 +0000 UTC m=+1151.814001488" Mar 07 07:18:56 crc kubenswrapper[4738]: I0307 07:18:56.957815 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:18:56 crc kubenswrapper[4738]: I0307 07:18:56.958557 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:18:56 crc kubenswrapper[4738]: I0307 07:18:56.958618 4738 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:18:56 crc kubenswrapper[4738]: I0307 07:18:56.959259 4738 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e5bec806593533cac1b1c1cc77e678f82478e03cc5fcc9234057125e94ccf2f7"} pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:18:56 crc kubenswrapper[4738]: I0307 07:18:56.959325 4738 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" containerID="cri-o://e5bec806593533cac1b1c1cc77e678f82478e03cc5fcc9234057125e94ccf2f7" gracePeriod=600 Mar 07 07:18:57 crc kubenswrapper[4738]: I0307 07:18:57.359190 4738 generic.go:334] "Generic (PLEG): container finished" podID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerID="e5bec806593533cac1b1c1cc77e678f82478e03cc5fcc9234057125e94ccf2f7" exitCode=0 Mar 07 07:18:57 crc kubenswrapper[4738]: I0307 07:18:57.359265 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerDied","Data":"e5bec806593533cac1b1c1cc77e678f82478e03cc5fcc9234057125e94ccf2f7"} Mar 07 07:18:57 crc kubenswrapper[4738]: I0307 07:18:57.359595 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerStarted","Data":"35a21fe5da4e09d4cd3572da738e94c43773cf6bc3d6e32eacd728f4e0c1f246"} Mar 07 07:18:57 crc kubenswrapper[4738]: I0307 07:18:57.359623 4738 scope.go:117] "RemoveContainer" containerID="243b9193ac6c5a75ee43ac39995d1462b4eb26a64963d0af07b51ef1dcd6c0f7" Mar 07 07:18:59 crc kubenswrapper[4738]: I0307 07:18:59.731711 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f949bdfdb-rrqv4" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.068907 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.073343 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.077362 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-files" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.077528 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-swift-dockercfg-cj9mf" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.077561 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-conf" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.077769 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-storage-config-data" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.105346 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.180806 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-lock\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.180892 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7zmv\" (UniqueName: \"kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-kube-api-access-q7zmv\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.180981 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift\") pod \"swift-storage-0\" (UID: 
\"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.181017 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-cache\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.181046 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.282839 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7zmv\" (UniqueName: \"kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-kube-api-access-q7zmv\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.283112 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: E0307 07:19:05.283292 4738 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:19:05 crc kubenswrapper[4738]: E0307 07:19:05.283312 4738 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 07 07:19:05 crc 
kubenswrapper[4738]: E0307 07:19:05.283383 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift podName:96ceb9b4-e0fc-488b-8ef4-e7092268ee23 nodeName:}" failed. No retries permitted until 2026-03-07 07:19:05.783360609 +0000 UTC m=+1164.248347960 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift") pod "swift-storage-0" (UID: "96ceb9b4-e0fc-488b-8ef4-e7092268ee23") : configmap "swift-ring-files" not found Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.283579 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-cache\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.283664 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-cache\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.283746 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.284238 4738 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") device mount path \"/mnt/openstack/pv07\"" 
pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.287348 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-lock\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.287723 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-lock\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.310189 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7zmv\" (UniqueName: \"kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-kube-api-access-q7zmv\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.316212 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: I0307 07:19:05.795643 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:05 crc kubenswrapper[4738]: E0307 07:19:05.795866 4738 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 
07 07:19:05 crc kubenswrapper[4738]: E0307 07:19:05.796139 4738 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 07 07:19:05 crc kubenswrapper[4738]: E0307 07:19:05.796253 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift podName:96ceb9b4-e0fc-488b-8ef4-e7092268ee23 nodeName:}" failed. No retries permitted until 2026-03-07 07:19:06.796227677 +0000 UTC m=+1165.261215028 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift") pod "swift-storage-0" (UID: "96ceb9b4-e0fc-488b-8ef4-e7092268ee23") : configmap "swift-ring-files" not found Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.035293 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-rsw5w"] Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.038008 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.043211 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-proxy-config-data" Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.057951 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-rsw5w"] Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.101423 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb4fk\" (UniqueName: \"kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-kube-api-access-jb4fk\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.101497 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-run-httpd\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.101546 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-config-data\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.101585 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-log-httpd\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: 
\"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.101608 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.202773 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-config-data\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.202869 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-log-httpd\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.202906 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.202956 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb4fk\" (UniqueName: \"kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-kube-api-access-jb4fk\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: 
\"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.203001 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-run-httpd\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: E0307 07:19:06.203136 4738 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:19:06 crc kubenswrapper[4738]: E0307 07:19:06.203181 4738 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-76c998454c-rsw5w: configmap "swift-ring-files" not found Mar 07 07:19:06 crc kubenswrapper[4738]: E0307 07:19:06.203240 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift podName:d469c92b-a4c2-488c-b38d-fbc142aaa0f0 nodeName:}" failed. No retries permitted until 2026-03-07 07:19:06.70322032 +0000 UTC m=+1165.168207641 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift") pod "swift-proxy-76c998454c-rsw5w" (UID: "d469c92b-a4c2-488c-b38d-fbc142aaa0f0") : configmap "swift-ring-files" not found Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.203592 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-run-httpd\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.203802 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-log-httpd\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.208883 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-config-data\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.232849 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb4fk\" (UniqueName: \"kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-kube-api-access-jb4fk\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.709192 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:06 crc kubenswrapper[4738]: E0307 07:19:06.709400 4738 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:19:06 crc kubenswrapper[4738]: E0307 07:19:06.709507 4738 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-76c998454c-rsw5w: configmap "swift-ring-files" not found Mar 07 07:19:06 crc kubenswrapper[4738]: E0307 07:19:06.709569 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift podName:d469c92b-a4c2-488c-b38d-fbc142aaa0f0 nodeName:}" failed. No retries permitted until 2026-03-07 07:19:07.70954762 +0000 UTC m=+1166.174534941 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift") pod "swift-proxy-76c998454c-rsw5w" (UID: "d469c92b-a4c2-488c-b38d-fbc142aaa0f0") : configmap "swift-ring-files" not found Mar 07 07:19:06 crc kubenswrapper[4738]: I0307 07:19:06.810333 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:06 crc kubenswrapper[4738]: E0307 07:19:06.810526 4738 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:19:06 crc kubenswrapper[4738]: E0307 07:19:06.810550 4738 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 07 07:19:06 crc kubenswrapper[4738]: E0307 07:19:06.810603 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift podName:96ceb9b4-e0fc-488b-8ef4-e7092268ee23 nodeName:}" failed. No retries permitted until 2026-03-07 07:19:08.810586717 +0000 UTC m=+1167.275574038 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift") pod "swift-storage-0" (UID: "96ceb9b4-e0fc-488b-8ef4-e7092268ee23") : configmap "swift-ring-files" not found Mar 07 07:19:07 crc kubenswrapper[4738]: I0307 07:19:07.725463 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:07 crc kubenswrapper[4738]: E0307 07:19:07.725633 4738 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:19:07 crc kubenswrapper[4738]: E0307 07:19:07.725781 4738 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-76c998454c-rsw5w: configmap "swift-ring-files" not found Mar 07 07:19:07 crc kubenswrapper[4738]: E0307 07:19:07.725831 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift podName:d469c92b-a4c2-488c-b38d-fbc142aaa0f0 nodeName:}" failed. No retries permitted until 2026-03-07 07:19:09.725815838 +0000 UTC m=+1168.190803159 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift") pod "swift-proxy-76c998454c-rsw5w" (UID: "d469c92b-a4c2-488c-b38d-fbc142aaa0f0") : configmap "swift-ring-files" not found Mar 07 07:19:08 crc kubenswrapper[4738]: I0307 07:19:08.855366 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:08 crc kubenswrapper[4738]: E0307 07:19:08.855524 4738 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:19:08 crc kubenswrapper[4738]: E0307 07:19:08.856419 4738 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 07 07:19:08 crc kubenswrapper[4738]: E0307 07:19:08.856577 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift podName:96ceb9b4-e0fc-488b-8ef4-e7092268ee23 nodeName:}" failed. No retries permitted until 2026-03-07 07:19:12.856555789 +0000 UTC m=+1171.321543120 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift") pod "swift-storage-0" (UID: "96ceb9b4-e0fc-488b-8ef4-e7092268ee23") : configmap "swift-ring-files" not found Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.041622 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-lpmzc"] Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.042797 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.045245 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.045611 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.065784 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-lpmzc"] Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.161679 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lrmk\" (UniqueName: \"kubernetes.io/projected/0910647e-f5c9-40b1-9f16-499cb8483b8d-kube-api-access-6lrmk\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.161733 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0910647e-f5c9-40b1-9f16-499cb8483b8d-swiftconf\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.161858 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0910647e-f5c9-40b1-9f16-499cb8483b8d-etc-swift\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.161937 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0910647e-f5c9-40b1-9f16-499cb8483b8d-dispersionconf\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.161963 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0910647e-f5c9-40b1-9f16-499cb8483b8d-scripts\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.162005 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0910647e-f5c9-40b1-9f16-499cb8483b8d-ring-data-devices\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.263975 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0910647e-f5c9-40b1-9f16-499cb8483b8d-dispersionconf\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.264039 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0910647e-f5c9-40b1-9f16-499cb8483b8d-scripts\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.264095 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/0910647e-f5c9-40b1-9f16-499cb8483b8d-ring-data-devices\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.264189 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lrmk\" (UniqueName: \"kubernetes.io/projected/0910647e-f5c9-40b1-9f16-499cb8483b8d-kube-api-access-6lrmk\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.264225 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0910647e-f5c9-40b1-9f16-499cb8483b8d-swiftconf\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.264333 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0910647e-f5c9-40b1-9f16-499cb8483b8d-etc-swift\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.265296 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0910647e-f5c9-40b1-9f16-499cb8483b8d-etc-swift\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.265399 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/0910647e-f5c9-40b1-9f16-499cb8483b8d-scripts\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.265673 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0910647e-f5c9-40b1-9f16-499cb8483b8d-ring-data-devices\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.270224 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0910647e-f5c9-40b1-9f16-499cb8483b8d-dispersionconf\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.270571 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0910647e-f5c9-40b1-9f16-499cb8483b8d-swiftconf\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.286365 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lrmk\" (UniqueName: \"kubernetes.io/projected/0910647e-f5c9-40b1-9f16-499cb8483b8d-kube-api-access-6lrmk\") pod \"swift-ring-rebalance-lpmzc\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.364553 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.772011 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:09 crc kubenswrapper[4738]: E0307 07:19:09.772270 4738 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:19:09 crc kubenswrapper[4738]: E0307 07:19:09.772556 4738 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-76c998454c-rsw5w: configmap "swift-ring-files" not found Mar 07 07:19:09 crc kubenswrapper[4738]: E0307 07:19:09.772638 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift podName:d469c92b-a4c2-488c-b38d-fbc142aaa0f0 nodeName:}" failed. No retries permitted until 2026-03-07 07:19:13.772609853 +0000 UTC m=+1172.237597184 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift") pod "swift-proxy-76c998454c-rsw5w" (UID: "d469c92b-a4c2-488c-b38d-fbc142aaa0f0") : configmap "swift-ring-files" not found Mar 07 07:19:09 crc kubenswrapper[4738]: I0307 07:19:09.817711 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-lpmzc"] Mar 07 07:19:10 crc kubenswrapper[4738]: I0307 07:19:10.472431 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" event={"ID":"0910647e-f5c9-40b1-9f16-499cb8483b8d","Type":"ContainerStarted","Data":"2ecf8734aab6411aa29f200fb2e50326526570a5170da9d7699e3c2eb4ec6adc"} Mar 07 07:19:12 crc kubenswrapper[4738]: I0307 07:19:12.937004 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:12 crc kubenswrapper[4738]: E0307 07:19:12.937318 4738 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:19:12 crc kubenswrapper[4738]: E0307 07:19:12.937726 4738 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 07 07:19:12 crc kubenswrapper[4738]: E0307 07:19:12.937823 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift podName:96ceb9b4-e0fc-488b-8ef4-e7092268ee23 nodeName:}" failed. No retries permitted until 2026-03-07 07:19:20.937786692 +0000 UTC m=+1179.402774063 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift") pod "swift-storage-0" (UID: "96ceb9b4-e0fc-488b-8ef4-e7092268ee23") : configmap "swift-ring-files" not found Mar 07 07:19:13 crc kubenswrapper[4738]: I0307 07:19:13.512535 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" event={"ID":"0910647e-f5c9-40b1-9f16-499cb8483b8d","Type":"ContainerStarted","Data":"7a9f320778051ddf4b7af67e32ddd382aad3976afba27bea7e44cc7f637a1ec6"} Mar 07 07:19:13 crc kubenswrapper[4738]: I0307 07:19:13.543319 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" podStartSLOduration=1.6879329159999998 podStartE2EDuration="4.543298383s" podCreationTimestamp="2026-03-07 07:19:09 +0000 UTC" firstStartedPulling="2026-03-07 07:19:09.820038407 +0000 UTC m=+1168.285025738" lastFinishedPulling="2026-03-07 07:19:12.675403884 +0000 UTC m=+1171.140391205" observedRunningTime="2026-03-07 07:19:13.539196791 +0000 UTC m=+1172.004184152" watchObservedRunningTime="2026-03-07 07:19:13.543298383 +0000 UTC m=+1172.008285714" Mar 07 07:19:13 crc kubenswrapper[4738]: I0307 07:19:13.854682 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:13 crc kubenswrapper[4738]: E0307 07:19:13.854921 4738 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:19:13 crc kubenswrapper[4738]: E0307 07:19:13.854956 4738 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-proxy-76c998454c-rsw5w: configmap 
"swift-ring-files" not found Mar 07 07:19:13 crc kubenswrapper[4738]: E0307 07:19:13.855032 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift podName:d469c92b-a4c2-488c-b38d-fbc142aaa0f0 nodeName:}" failed. No retries permitted until 2026-03-07 07:19:21.855008088 +0000 UTC m=+1180.319995439 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift") pod "swift-proxy-76c998454c-rsw5w" (UID: "d469c92b-a4c2-488c-b38d-fbc142aaa0f0") : configmap "swift-ring-files" not found Mar 07 07:19:19 crc kubenswrapper[4738]: I0307 07:19:19.568854 4738 generic.go:334] "Generic (PLEG): container finished" podID="0910647e-f5c9-40b1-9f16-499cb8483b8d" containerID="7a9f320778051ddf4b7af67e32ddd382aad3976afba27bea7e44cc7f637a1ec6" exitCode=0 Mar 07 07:19:19 crc kubenswrapper[4738]: I0307 07:19:19.568978 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" event={"ID":"0910647e-f5c9-40b1-9f16-499cb8483b8d","Type":"ContainerDied","Data":"7a9f320778051ddf4b7af67e32ddd382aad3976afba27bea7e44cc7f637a1ec6"} Mar 07 07:19:20 crc kubenswrapper[4738]: I0307 07:19:20.988781 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:20 crc kubenswrapper[4738]: I0307 07:19:20.992846 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.004278 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift\") pod \"swift-storage-0\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.094277 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0910647e-f5c9-40b1-9f16-499cb8483b8d-etc-swift\") pod \"0910647e-f5c9-40b1-9f16-499cb8483b8d\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.094466 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0910647e-f5c9-40b1-9f16-499cb8483b8d-ring-data-devices\") pod \"0910647e-f5c9-40b1-9f16-499cb8483b8d\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.094547 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0910647e-f5c9-40b1-9f16-499cb8483b8d-dispersionconf\") pod \"0910647e-f5c9-40b1-9f16-499cb8483b8d\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.094853 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/0910647e-f5c9-40b1-9f16-499cb8483b8d-scripts\") pod \"0910647e-f5c9-40b1-9f16-499cb8483b8d\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.094951 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0910647e-f5c9-40b1-9f16-499cb8483b8d-swiftconf\") pod \"0910647e-f5c9-40b1-9f16-499cb8483b8d\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.095003 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lrmk\" (UniqueName: \"kubernetes.io/projected/0910647e-f5c9-40b1-9f16-499cb8483b8d-kube-api-access-6lrmk\") pod \"0910647e-f5c9-40b1-9f16-499cb8483b8d\" (UID: \"0910647e-f5c9-40b1-9f16-499cb8483b8d\") " Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.095426 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0910647e-f5c9-40b1-9f16-499cb8483b8d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0910647e-f5c9-40b1-9f16-499cb8483b8d" (UID: "0910647e-f5c9-40b1-9f16-499cb8483b8d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.095563 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0910647e-f5c9-40b1-9f16-499cb8483b8d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.096366 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0910647e-f5c9-40b1-9f16-499cb8483b8d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0910647e-f5c9-40b1-9f16-499cb8483b8d" (UID: "0910647e-f5c9-40b1-9f16-499cb8483b8d"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.100708 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0910647e-f5c9-40b1-9f16-499cb8483b8d-kube-api-access-6lrmk" (OuterVolumeSpecName: "kube-api-access-6lrmk") pod "0910647e-f5c9-40b1-9f16-499cb8483b8d" (UID: "0910647e-f5c9-40b1-9f16-499cb8483b8d"). InnerVolumeSpecName "kube-api-access-6lrmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.113194 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0910647e-f5c9-40b1-9f16-499cb8483b8d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0910647e-f5c9-40b1-9f16-499cb8483b8d" (UID: "0910647e-f5c9-40b1-9f16-499cb8483b8d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.121205 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0910647e-f5c9-40b1-9f16-499cb8483b8d-scripts" (OuterVolumeSpecName: "scripts") pod "0910647e-f5c9-40b1-9f16-499cb8483b8d" (UID: "0910647e-f5c9-40b1-9f16-499cb8483b8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.134354 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0910647e-f5c9-40b1-9f16-499cb8483b8d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0910647e-f5c9-40b1-9f16-499cb8483b8d" (UID: "0910647e-f5c9-40b1-9f16-499cb8483b8d"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.196750 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0910647e-f5c9-40b1-9f16-499cb8483b8d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.196780 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0910647e-f5c9-40b1-9f16-499cb8483b8d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.196790 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0910647e-f5c9-40b1-9f16-499cb8483b8d-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.196816 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0910647e-f5c9-40b1-9f16-499cb8483b8d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.196827 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lrmk\" (UniqueName: \"kubernetes.io/projected/0910647e-f5c9-40b1-9f16-499cb8483b8d-kube-api-access-6lrmk\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.298230 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.583089 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" event={"ID":"0910647e-f5c9-40b1-9f16-499cb8483b8d","Type":"ContainerDied","Data":"2ecf8734aab6411aa29f200fb2e50326526570a5170da9d7699e3c2eb4ec6adc"} Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.583127 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ecf8734aab6411aa29f200fb2e50326526570a5170da9d7699e3c2eb4ec6adc" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.583193 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-lpmzc" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.784317 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 07 07:19:21 crc kubenswrapper[4738]: W0307 07:19:21.800512 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96ceb9b4_e0fc_488b_8ef4_e7092268ee23.slice/crio-e4672af42ab4a31dc2864a5375b24527b0d4847a8526040e4c1e994f0b584462 WatchSource:0}: Error finding container e4672af42ab4a31dc2864a5375b24527b0d4847a8526040e4c1e994f0b584462: Status 404 returned error can't find the container with id e4672af42ab4a31dc2864a5375b24527b0d4847a8526040e4c1e994f0b584462 Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.822559 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-lpmzc_0910647e-f5c9-40b1-9f16-499cb8483b8d/swift-ring-rebalance/0.log" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.907070 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift\") pod 
\"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.913530 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift\") pod \"swift-proxy-76c998454c-rsw5w\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:21 crc kubenswrapper[4738]: I0307 07:19:21.959371 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:22 crc kubenswrapper[4738]: I0307 07:19:22.423364 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-rsw5w"] Mar 07 07:19:22 crc kubenswrapper[4738]: W0307 07:19:22.440079 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd469c92b_a4c2_488c_b38d_fbc142aaa0f0.slice/crio-854556c446e68d0325a0740f1187e6ec503bf1a7f9c40fa5402b2e122a9b54ba WatchSource:0}: Error finding container 854556c446e68d0325a0740f1187e6ec503bf1a7f9c40fa5402b2e122a9b54ba: Status 404 returned error can't find the container with id 854556c446e68d0325a0740f1187e6ec503bf1a7f9c40fa5402b2e122a9b54ba Mar 07 07:19:22 crc kubenswrapper[4738]: I0307 07:19:22.599568 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" event={"ID":"d469c92b-a4c2-488c-b38d-fbc142aaa0f0","Type":"ContainerStarted","Data":"854556c446e68d0325a0740f1187e6ec503bf1a7f9c40fa5402b2e122a9b54ba"} Mar 07 07:19:22 crc kubenswrapper[4738]: I0307 07:19:22.600839 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"e4672af42ab4a31dc2864a5375b24527b0d4847a8526040e4c1e994f0b584462"} Mar 07 07:19:23 crc kubenswrapper[4738]: I0307 07:19:23.508029 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-lpmzc_0910647e-f5c9-40b1-9f16-499cb8483b8d/swift-ring-rebalance/0.log" Mar 07 07:19:23 crc kubenswrapper[4738]: I0307 07:19:23.609130 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" event={"ID":"d469c92b-a4c2-488c-b38d-fbc142aaa0f0","Type":"ContainerStarted","Data":"5a6d3189cedcc750eda7e33f741ffda29b03f79af55f5972991d95dc96bd5369"} Mar 07 07:19:23 crc kubenswrapper[4738]: I0307 07:19:23.609211 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" event={"ID":"d469c92b-a4c2-488c-b38d-fbc142aaa0f0","Type":"ContainerStarted","Data":"50cb6e29d71a2457f9a390a1f4d77a5922d8ae1fc02ffa5c815026823eb0ca0d"} Mar 07 07:19:23 crc kubenswrapper[4738]: I0307 07:19:23.609257 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:23 crc kubenswrapper[4738]: I0307 07:19:23.609648 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:23 crc kubenswrapper[4738]: I0307 07:19:23.612141 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"5c8a691cfe1a8ab8c30a26b23434cdb8ee4f0a45358a201eb4ecef3b2797785a"} Mar 07 07:19:23 crc kubenswrapper[4738]: I0307 07:19:23.612195 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"a2ffa63c4afb5dd07e9a57782d9c93b48afd95857655b0d188d0b3255e43c57d"} Mar 07 07:19:23 crc kubenswrapper[4738]: I0307 07:19:23.612213 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"7e3233cc25e094b9420df4a7ee6bad5a6ddbd092e5b08e43d31d29b920a2d134"} Mar 07 07:19:23 crc kubenswrapper[4738]: I0307 07:19:23.612228 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"c9a84b5ce18a92c7cd8e2aa843f5b3cacd9d67fea7fb130fc739af256a55b82c"} Mar 07 07:19:23 crc kubenswrapper[4738]: I0307 07:19:23.627475 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" podStartSLOduration=17.627459871 podStartE2EDuration="17.627459871s" podCreationTimestamp="2026-03-07 07:19:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:19:23.626193657 +0000 UTC m=+1182.091181008" watchObservedRunningTime="2026-03-07 07:19:23.627459871 +0000 UTC m=+1182.092447192" Mar 07 07:19:25 crc kubenswrapper[4738]: I0307 07:19:25.195518 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-lpmzc_0910647e-f5c9-40b1-9f16-499cb8483b8d/swift-ring-rebalance/0.log" Mar 07 07:19:25 crc kubenswrapper[4738]: I0307 07:19:25.641330 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"023ab348f9630e3188399acebb441225f16fe8a3ea0a45562a1416ce00ed2a1d"} Mar 07 07:19:25 crc kubenswrapper[4738]: I0307 07:19:25.641698 4738 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"268471195f290d34a1640c6c40556c0cc0df5c38ba265a2054454572ddb1a8aa"} Mar 07 07:19:25 crc kubenswrapper[4738]: I0307 07:19:25.641722 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"fea746265db6cca797f4aab10f0142d479f000ff811043ba2d786b1386e70f97"} Mar 07 07:19:25 crc kubenswrapper[4738]: I0307 07:19:25.641739 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"634c98d8bd723fe8365bb0a579d302058fe17e15dc44a8c89fb077a91f9db720"} Mar 07 07:19:26 crc kubenswrapper[4738]: I0307 07:19:26.663052 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"5a05ae7d39e3985637e6577343ab598669a43dc85bcd5dd09ddc9b373c1cc837"} Mar 07 07:19:26 crc kubenswrapper[4738]: I0307 07:19:26.879108 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-lpmzc_0910647e-f5c9-40b1-9f16-499cb8483b8d/swift-ring-rebalance/0.log" Mar 07 07:19:27 crc kubenswrapper[4738]: I0307 07:19:27.684786 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"137defbc952bf81c1f496e28b2c150ef0bf8561eb3634039ffac430db7e04802"} Mar 07 07:19:27 crc kubenswrapper[4738]: I0307 07:19:27.684829 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"b4724281a96bff261edcd51fb75fe7f352935e15dfd29e22addc75c7164d87df"} Mar 07 07:19:27 crc kubenswrapper[4738]: I0307 07:19:27.684840 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"0bd0f18099d0d2501aaf5782a927bc5913e1c189ce48a68e47f375657b6799b0"} Mar 07 07:19:28 crc kubenswrapper[4738]: I0307 07:19:28.540652 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-lpmzc_0910647e-f5c9-40b1-9f16-499cb8483b8d/swift-ring-rebalance/0.log" Mar 07 07:19:28 crc kubenswrapper[4738]: I0307 07:19:28.703843 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"3da28d20ae0901df855c4510805efbf79e985addd356e9506c54db5f6d027287"} Mar 07 07:19:28 crc kubenswrapper[4738]: I0307 07:19:28.703905 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"4ac2a985e3cb60a09d256b7974029df113f3080d4297a16f859b41d021df8333"} Mar 07 07:19:28 crc kubenswrapper[4738]: I0307 07:19:28.703919 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerStarted","Data":"90084ef90d77e89230917bd780cb78df4fafd97840fe50ada61c1c20a43914f9"} Mar 07 07:19:28 crc kubenswrapper[4738]: I0307 07:19:28.807710 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-0" podStartSLOduration=20.323510423 podStartE2EDuration="24.80768823s" podCreationTimestamp="2026-03-07 07:19:04 +0000 UTC" firstStartedPulling="2026-03-07 07:19:21.805406598 +0000 UTC 
m=+1180.270393969" lastFinishedPulling="2026-03-07 07:19:26.289584455 +0000 UTC m=+1184.754571776" observedRunningTime="2026-03-07 07:19:28.788938479 +0000 UTC m=+1187.253925810" watchObservedRunningTime="2026-03-07 07:19:28.80768823 +0000 UTC m=+1187.272675561" Mar 07 07:19:30 crc kubenswrapper[4738]: I0307 07:19:30.701041 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-lpmzc_0910647e-f5c9-40b1-9f16-499cb8483b8d/swift-ring-rebalance/0.log" Mar 07 07:19:31 crc kubenswrapper[4738]: I0307 07:19:31.964114 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:31 crc kubenswrapper[4738]: I0307 07:19:31.965330 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:32 crc kubenswrapper[4738]: I0307 07:19:32.377301 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-lpmzc_0910647e-f5c9-40b1-9f16-499cb8483b8d/swift-ring-rebalance/0.log" Mar 07 07:19:34 crc kubenswrapper[4738]: I0307 07:19:34.031032 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-lpmzc_0910647e-f5c9-40b1-9f16-499cb8483b8d/swift-ring-rebalance/0.log" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.469457 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 07 07:19:35 crc kubenswrapper[4738]: E0307 07:19:35.470403 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0910647e-f5c9-40b1-9f16-499cb8483b8d" containerName="swift-ring-rebalance" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.470433 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0910647e-f5c9-40b1-9f16-499cb8483b8d" containerName="swift-ring-rebalance" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.470713 4738 
memory_manager.go:354] "RemoveStaleState removing state" podUID="0910647e-f5c9-40b1-9f16-499cb8483b8d" containerName="swift-ring-rebalance" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.478380 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.484486 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.493059 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.525582 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.532762 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.622292 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-lpmzc"] Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.637211 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-lpmzc"] Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.662454 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-q9gvn"] Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.663686 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.667250 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.667315 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0a930684-b4c9-4238-a12d-01befcc25561-lock\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.667361 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.667385 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m69bs\" (UniqueName: \"kubernetes.io/projected/0a930684-b4c9-4238-a12d-01befcc25561-kube-api-access-m69bs\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.667425 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-lock\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 
07:19:35.667459 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0a930684-b4c9-4238-a12d-01befcc25561-cache\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.667596 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0a930684-b4c9-4238-a12d-01befcc25561-etc-swift\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.667642 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-cache\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.667664 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-etc-swift\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.667682 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkhjr\" (UniqueName: \"kubernetes.io/projected/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-kube-api-access-gkhjr\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.675339 4738 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.675854 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.685281 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-q9gvn"]
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.768865 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.768911 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m69bs\" (UniqueName: \"kubernetes.io/projected/0a930684-b4c9-4238-a12d-01befcc25561-kube-api-access-m69bs\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.768945 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/022c5cfe-d18f-420f-8f7d-ac95c111dc54-scripts\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.768971 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-lock\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.768990 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfmd9\" (UniqueName: \"kubernetes.io/projected/022c5cfe-d18f-420f-8f7d-ac95c111dc54-kube-api-access-vfmd9\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.769020 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0a930684-b4c9-4238-a12d-01befcc25561-cache\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.769036 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/022c5cfe-d18f-420f-8f7d-ac95c111dc54-ring-data-devices\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.769057 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0a930684-b4c9-4238-a12d-01befcc25561-etc-swift\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.769073 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/022c5cfe-d18f-420f-8f7d-ac95c111dc54-etc-swift\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.769098 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/022c5cfe-d18f-420f-8f7d-ac95c111dc54-dispersionconf\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.769131 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-cache\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.769169 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-etc-swift\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.769188 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkhjr\" (UniqueName: \"kubernetes.io/projected/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-kube-api-access-gkhjr\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.769208 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.769223 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/022c5cfe-d18f-420f-8f7d-ac95c111dc54-swiftconf\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.769245 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0a930684-b4c9-4238-a12d-01befcc25561-lock\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.769277 4738 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") device mount path \"/mnt/openstack/pv09\"" pod="swift-kuttl-tests/swift-storage-1"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.769882 4738 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") device mount path \"/mnt/openstack/pv02\"" pod="swift-kuttl-tests/swift-storage-2"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.769917 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-lock\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.770312 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-cache\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.773744 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0a930684-b4c9-4238-a12d-01befcc25561-lock\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.774967 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0a930684-b4c9-4238-a12d-01befcc25561-cache\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.776220 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0a930684-b4c9-4238-a12d-01befcc25561-etc-swift\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.782309 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-etc-swift\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.788758 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.789413 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.803031 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m69bs\" (UniqueName: \"kubernetes.io/projected/0a930684-b4c9-4238-a12d-01befcc25561-kube-api-access-m69bs\") pod \"swift-storage-1\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " pod="swift-kuttl-tests/swift-storage-1"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.805281 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkhjr\" (UniqueName: \"kubernetes.io/projected/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-kube-api-access-gkhjr\") pod \"swift-storage-2\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " pod="swift-kuttl-tests/swift-storage-2"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.817453 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.851979 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.870090 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/022c5cfe-d18f-420f-8f7d-ac95c111dc54-scripts\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.870139 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfmd9\" (UniqueName: \"kubernetes.io/projected/022c5cfe-d18f-420f-8f7d-ac95c111dc54-kube-api-access-vfmd9\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.870231 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/022c5cfe-d18f-420f-8f7d-ac95c111dc54-ring-data-devices\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.870252 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/022c5cfe-d18f-420f-8f7d-ac95c111dc54-etc-swift\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.870293 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/022c5cfe-d18f-420f-8f7d-ac95c111dc54-dispersionconf\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.870348 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/022c5cfe-d18f-420f-8f7d-ac95c111dc54-swiftconf\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.871178 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/022c5cfe-d18f-420f-8f7d-ac95c111dc54-etc-swift\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.871292 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/022c5cfe-d18f-420f-8f7d-ac95c111dc54-scripts\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.871817 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/022c5cfe-d18f-420f-8f7d-ac95c111dc54-ring-data-devices\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.874653 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/022c5cfe-d18f-420f-8f7d-ac95c111dc54-swiftconf\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.878630 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/022c5cfe-d18f-420f-8f7d-ac95c111dc54-dispersionconf\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.891023 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfmd9\" (UniqueName: \"kubernetes.io/projected/022c5cfe-d18f-420f-8f7d-ac95c111dc54-kube-api-access-vfmd9\") pod \"swift-ring-rebalance-q9gvn\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") " pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:35 crc kubenswrapper[4738]: I0307 07:19:35.976644 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:36 crc kubenswrapper[4738]: I0307 07:19:36.266331 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"]
Mar 07 07:19:36 crc kubenswrapper[4738]: I0307 07:19:36.344296 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"]
Mar 07 07:19:36 crc kubenswrapper[4738]: I0307 07:19:36.399498 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0910647e-f5c9-40b1-9f16-499cb8483b8d" path="/var/lib/kubelet/pods/0910647e-f5c9-40b1-9f16-499cb8483b8d/volumes"
Mar 07 07:19:36 crc kubenswrapper[4738]: I0307 07:19:36.414505 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-q9gvn"]
Mar 07 07:19:36 crc kubenswrapper[4738]: W0307 07:19:36.423829 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod022c5cfe_d18f_420f_8f7d_ac95c111dc54.slice/crio-792cfbe281cba3ea742ec4effec783edea00803e9ad39003c18f078d2694b30c WatchSource:0}: Error finding container 792cfbe281cba3ea742ec4effec783edea00803e9ad39003c18f078d2694b30c: Status 404 returned error can't find the container with id 792cfbe281cba3ea742ec4effec783edea00803e9ad39003c18f078d2694b30c
Mar 07 07:19:36 crc kubenswrapper[4738]: I0307 07:19:36.782462 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn" event={"ID":"022c5cfe-d18f-420f-8f7d-ac95c111dc54","Type":"ContainerStarted","Data":"5b8dd2431b82438d7e854cccd8e8b8fcf937aa0db16eb2239920149686134af0"}
Mar 07 07:19:36 crc kubenswrapper[4738]: I0307 07:19:36.782721 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn" event={"ID":"022c5cfe-d18f-420f-8f7d-ac95c111dc54","Type":"ContainerStarted","Data":"792cfbe281cba3ea742ec4effec783edea00803e9ad39003c18f078d2694b30c"}
Mar 07 07:19:36 crc kubenswrapper[4738]: I0307 07:19:36.787129 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"bf7ea1f65a1c10b00d492aa501355337b6dba851f133cc9e81fc1dbcb1865718"}
Mar 07 07:19:36 crc kubenswrapper[4738]: I0307 07:19:36.787177 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"b81632c1938ae5c7f37a56623bb7e08e9c9b83799f684481b5b287988a038ff8"}
Mar 07 07:19:36 crc kubenswrapper[4738]: I0307 07:19:36.787189 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"0731eabda154de1456330e6822fe2448acf6b44750560d7f60f385949f040298"}
Mar 07 07:19:36 crc kubenswrapper[4738]: I0307 07:19:36.788901 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"be3d8c83518f0f3933501a1d36cf0fbd8bfd4dd31113abf7e182f8d39be340cb"}
Mar 07 07:19:36 crc kubenswrapper[4738]: I0307 07:19:36.788924 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"2fff460fa23e4d25e3da3e96d8425d2eb15036c2351751c1a3aebcd697cfb1d9"}
Mar 07 07:19:36 crc kubenswrapper[4738]: I0307 07:19:36.788934 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"3414f1f3c96226ea3a3659df89d64de3ccb472634fc4c0164e01395a152d5c9c"}
Mar 07 07:19:36 crc kubenswrapper[4738]: I0307 07:19:36.804552 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn" podStartSLOduration=1.804533818 podStartE2EDuration="1.804533818s" podCreationTimestamp="2026-03-07 07:19:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:19:36.79948895 +0000 UTC m=+1195.264476281" watchObservedRunningTime="2026-03-07 07:19:36.804533818 +0000 UTC m=+1195.269521139"
Mar 07 07:19:37 crc kubenswrapper[4738]: I0307 07:19:37.802239 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"526da633fdfa13ab59af0be631f0a57382d2c13361cb1aa538f116b31206c3a6"}
Mar 07 07:19:37 crc kubenswrapper[4738]: I0307 07:19:37.802636 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"50d54dad2c11e1433417c1d5eacc2525ef13ccf32383fac908e7da619461d287"}
Mar 07 07:19:37 crc kubenswrapper[4738]: I0307 07:19:37.802651 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"e8f567722e26c5855f8a335226bfbd216ea673204774b39a300d238c1101b3de"}
Mar 07 07:19:37 crc kubenswrapper[4738]: I0307 07:19:37.802662 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"74c28c6df141378f88f7488ff6bfdacddb398f412f23b920bdc8e2344aa82df6"}
Mar 07 07:19:37 crc kubenswrapper[4738]: I0307 07:19:37.802671 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"32adec2ee3d7eb5e9f9c524bd23ecaedb1d049e703cc9542af2e891a02130be4"}
Mar 07 07:19:37 crc kubenswrapper[4738]: I0307 07:19:37.825654 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"d87d9a39b92a07b8170662ba15ed360e84ba521d89ecbb449261b231eb5f9b80"}
Mar 07 07:19:37 crc kubenswrapper[4738]: I0307 07:19:37.825708 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"0c1b0371d893f6e0390810a8aab913c1d9657f66534ae42885c70690589bf9e9"}
Mar 07 07:19:37 crc kubenswrapper[4738]: I0307 07:19:37.825720 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"53b976cf051f11cb8f660e774114d0ee41f535e9dedd4f043ca21c490af87734"}
Mar 07 07:19:37 crc kubenswrapper[4738]: I0307 07:19:37.825730 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"c5c5b4afaf52f1ecfc98c8dcfbd532bf19abe26898fc6b73352a7c16747d6d59"}
Mar 07 07:19:37 crc kubenswrapper[4738]: I0307 07:19:37.825778 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"0d5b3c74b209817952ccb739c381b7b9d12d42e85d7e4044ba3391791272d781"}
Mar 07 07:19:38 crc kubenswrapper[4738]: I0307 07:19:38.856206 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"447e67220d4537b714a6ce7a1b0dc1da7d80f2bb3a44e88e3217ceea7c2cc9fb"}
Mar 07 07:19:38 crc kubenswrapper[4738]: I0307 07:19:38.856513 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"52a417a8eb045429236afe6ab639ca8c67812669b31cd87f19e7fedeb5dbbb8f"}
Mar 07 07:19:38 crc kubenswrapper[4738]: I0307 07:19:38.856523 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"ff3f41c50d65750c83aa588726223a30040cd27c4fe4b1cd143575c10ea82fbb"}
Mar 07 07:19:38 crc kubenswrapper[4738]: I0307 07:19:38.856532 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"3d2c3ed7df67a6ff4073f7d5d24e40afac401c61651cdd6cc0c5759241ca9803"}
Mar 07 07:19:38 crc kubenswrapper[4738]: I0307 07:19:38.856541 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"56815f9730bc75bdcd8f18fc8d3075fc23567b21f96c07f6bab481f604850c00"}
Mar 07 07:19:38 crc kubenswrapper[4738]: I0307 07:19:38.856549 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"bf95213389fa376cb539bf6d8ad3ac8578576ffa41752dec1d469f2c9f7d6709"}
Mar 07 07:19:38 crc kubenswrapper[4738]: I0307 07:19:38.877718 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"39b4e0f1e6d7033b1684d6624ba3e2c94b2b55568b02cdcebd3bae46f528dfc1"}
Mar 07 07:19:38 crc kubenswrapper[4738]: I0307 07:19:38.877769 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"01acc461f0e0e846b5a4e7ada9595d0b8c4c9683efed29b5353bf8ef995cf52a"}
Mar 07 07:19:38 crc kubenswrapper[4738]: I0307 07:19:38.877786 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"40f811bec3d868d647e181fbd2ff35f48f2b70d83c3724ab4152f5aaefd0edab"}
Mar 07 07:19:38 crc kubenswrapper[4738]: I0307 07:19:38.877797 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"f6afa155d642e49003483537eaf5f80def6b71bbd9ccee398e1d91a13ac268a8"}
Mar 07 07:19:38 crc kubenswrapper[4738]: I0307 07:19:38.877809 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"c9accd7d56723a30b1e064c6db482dfe3c0c8c25a3362cb503fc9e1740a21d81"}
Mar 07 07:19:38 crc kubenswrapper[4738]: I0307 07:19:38.877820 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"cbcd5ef3a6beceade6a10aeef664573859ddea9850fed0f9280c7405361c726f"}
Mar 07 07:19:39 crc kubenswrapper[4738]: I0307 07:19:39.894051 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"e4c18fbcf45c419a845fca6021065e426ac363e04fbaedafa8d5996752493737"}
Mar 07 07:19:39 crc kubenswrapper[4738]: I0307 07:19:39.894101 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerStarted","Data":"6fc2d79e6ed76a2aa3902d1f04776c46bc6e6c351eb75738758e7d758faea391"}
Mar 07 07:19:39 crc kubenswrapper[4738]: I0307 07:19:39.904926 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"3c4cc8214b36420635ba039d3c00c6948d58d957980e0d1c9562383bfca77fc9"}
Mar 07 07:19:39 crc kubenswrapper[4738]: I0307 07:19:39.904971 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerStarted","Data":"8c2fe218c6e02f331d33d7063e8482d45cd9f32b3789ca9c49d926c67499fc14"}
Mar 07 07:19:40 crc kubenswrapper[4738]: I0307 07:19:40.015346 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-1" podStartSLOduration=6.015321842 podStartE2EDuration="6.015321842s" podCreationTimestamp="2026-03-07 07:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:19:39.952425856 +0000 UTC m=+1198.417413187" watchObservedRunningTime="2026-03-07 07:19:40.015321842 +0000 UTC m=+1198.480309173"
Mar 07 07:19:40 crc kubenswrapper[4738]: I0307 07:19:40.019727 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-2" podStartSLOduration=6.019711122 podStartE2EDuration="6.019711122s" podCreationTimestamp="2026-03-07 07:19:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:19:40.011247681 +0000 UTC m=+1198.476234992" watchObservedRunningTime="2026-03-07 07:19:40.019711122 +0000 UTC m=+1198.484698453"
Mar 07 07:19:45 crc kubenswrapper[4738]: I0307 07:19:45.963110 4738 generic.go:334] "Generic (PLEG): container finished" podID="022c5cfe-d18f-420f-8f7d-ac95c111dc54" containerID="5b8dd2431b82438d7e854cccd8e8b8fcf937aa0db16eb2239920149686134af0" exitCode=0
Mar 07 07:19:45 crc kubenswrapper[4738]: I0307 07:19:45.963179 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn" event={"ID":"022c5cfe-d18f-420f-8f7d-ac95c111dc54","Type":"ContainerDied","Data":"5b8dd2431b82438d7e854cccd8e8b8fcf937aa0db16eb2239920149686134af0"}
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.279323 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.358842 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vfmd9\" (UniqueName: \"kubernetes.io/projected/022c5cfe-d18f-420f-8f7d-ac95c111dc54-kube-api-access-vfmd9\") pod \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") "
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.359297 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/022c5cfe-d18f-420f-8f7d-ac95c111dc54-ring-data-devices\") pod \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") "
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.359317 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/022c5cfe-d18f-420f-8f7d-ac95c111dc54-dispersionconf\") pod \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") "
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.359356 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/022c5cfe-d18f-420f-8f7d-ac95c111dc54-swiftconf\") pod \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") "
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.359439 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/022c5cfe-d18f-420f-8f7d-ac95c111dc54-etc-swift\") pod \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") "
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.359460 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/022c5cfe-d18f-420f-8f7d-ac95c111dc54-scripts\") pod \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\" (UID: \"022c5cfe-d18f-420f-8f7d-ac95c111dc54\") "
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.361750 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/022c5cfe-d18f-420f-8f7d-ac95c111dc54-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "022c5cfe-d18f-420f-8f7d-ac95c111dc54" (UID: "022c5cfe-d18f-420f-8f7d-ac95c111dc54"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.361846 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/022c5cfe-d18f-420f-8f7d-ac95c111dc54-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "022c5cfe-d18f-420f-8f7d-ac95c111dc54" (UID: "022c5cfe-d18f-420f-8f7d-ac95c111dc54"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.368279 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/022c5cfe-d18f-420f-8f7d-ac95c111dc54-kube-api-access-vfmd9" (OuterVolumeSpecName: "kube-api-access-vfmd9") pod "022c5cfe-d18f-420f-8f7d-ac95c111dc54" (UID: "022c5cfe-d18f-420f-8f7d-ac95c111dc54"). InnerVolumeSpecName "kube-api-access-vfmd9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.383004 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/022c5cfe-d18f-420f-8f7d-ac95c111dc54-scripts" (OuterVolumeSpecName: "scripts") pod "022c5cfe-d18f-420f-8f7d-ac95c111dc54" (UID: "022c5cfe-d18f-420f-8f7d-ac95c111dc54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.391239 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/022c5cfe-d18f-420f-8f7d-ac95c111dc54-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "022c5cfe-d18f-420f-8f7d-ac95c111dc54" (UID: "022c5cfe-d18f-420f-8f7d-ac95c111dc54"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.404127 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/022c5cfe-d18f-420f-8f7d-ac95c111dc54-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "022c5cfe-d18f-420f-8f7d-ac95c111dc54" (UID: "022c5cfe-d18f-420f-8f7d-ac95c111dc54"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.461978 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/022c5cfe-d18f-420f-8f7d-ac95c111dc54-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.462013 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/022c5cfe-d18f-420f-8f7d-ac95c111dc54-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.462024 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/022c5cfe-d18f-420f-8f7d-ac95c111dc54-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.462034 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vfmd9\" (UniqueName: \"kubernetes.io/projected/022c5cfe-d18f-420f-8f7d-ac95c111dc54-kube-api-access-vfmd9\") on node \"crc\" DevicePath \"\""
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.462047 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/022c5cfe-d18f-420f-8f7d-ac95c111dc54-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.462059 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/022c5cfe-d18f-420f-8f7d-ac95c111dc54-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.981521 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn" event={"ID":"022c5cfe-d18f-420f-8f7d-ac95c111dc54","Type":"ContainerDied","Data":"792cfbe281cba3ea742ec4effec783edea00803e9ad39003c18f078d2694b30c"}
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.981561 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="792cfbe281cba3ea742ec4effec783edea00803e9ad39003c18f078d2694b30c"
Mar 07 07:19:47 crc kubenswrapper[4738]: I0307 07:19:47.981650 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-q9gvn"
Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.324338 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw"]
Mar 07 07:19:48 crc kubenswrapper[4738]: E0307 07:19:48.324714 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="022c5cfe-d18f-420f-8f7d-ac95c111dc54" containerName="swift-ring-rebalance"
Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.324737 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="022c5cfe-d18f-420f-8f7d-ac95c111dc54" containerName="swift-ring-rebalance"
Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.324914 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="022c5cfe-d18f-420f-8f7d-ac95c111dc54" containerName="swift-ring-rebalance"
Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.328818 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw"
Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.332412 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.332627 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.336855 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw"]
Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.375207 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/844406a3-2357-4470-842a-8ca3ae135554-scripts\") pod \"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw"
Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.375273 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/844406a3-2357-4470-842a-8ca3ae135554-swiftconf\") pod \"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.375308 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/844406a3-2357-4470-842a-8ca3ae135554-dispersionconf\") pod \"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.375336 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/844406a3-2357-4470-842a-8ca3ae135554-ring-data-devices\") pod \"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.375380 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5r9pz\" (UniqueName: \"kubernetes.io/projected/844406a3-2357-4470-842a-8ca3ae135554-kube-api-access-5r9pz\") pod \"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.375428 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/844406a3-2357-4470-842a-8ca3ae135554-etc-swift\") pod 
\"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.476512 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/844406a3-2357-4470-842a-8ca3ae135554-etc-swift\") pod \"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.476574 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/844406a3-2357-4470-842a-8ca3ae135554-scripts\") pod \"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.476629 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/844406a3-2357-4470-842a-8ca3ae135554-swiftconf\") pod \"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.476662 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/844406a3-2357-4470-842a-8ca3ae135554-dispersionconf\") pod \"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.476690 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/844406a3-2357-4470-842a-8ca3ae135554-ring-data-devices\") pod 
\"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.476741 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5r9pz\" (UniqueName: \"kubernetes.io/projected/844406a3-2357-4470-842a-8ca3ae135554-kube-api-access-5r9pz\") pod \"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.477425 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/844406a3-2357-4470-842a-8ca3ae135554-etc-swift\") pod \"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.477815 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/844406a3-2357-4470-842a-8ca3ae135554-ring-data-devices\") pod \"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.478509 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/844406a3-2357-4470-842a-8ca3ae135554-scripts\") pod \"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.483127 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/844406a3-2357-4470-842a-8ca3ae135554-dispersionconf\") pod 
\"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.489589 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/844406a3-2357-4470-842a-8ca3ae135554-swiftconf\") pod \"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.494141 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5r9pz\" (UniqueName: \"kubernetes.io/projected/844406a3-2357-4470-842a-8ca3ae135554-kube-api-access-5r9pz\") pod \"swift-ring-rebalance-debug-8b5jw\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:48 crc kubenswrapper[4738]: I0307 07:19:48.654085 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:49 crc kubenswrapper[4738]: I0307 07:19:49.108989 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw"] Mar 07 07:19:49 crc kubenswrapper[4738]: W0307 07:19:49.112369 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod844406a3_2357_4470_842a_8ca3ae135554.slice/crio-5abdc296c3650bb5831a83f73c4089443d7b9b77f70a3d34c53fc26d0eb72d0b WatchSource:0}: Error finding container 5abdc296c3650bb5831a83f73c4089443d7b9b77f70a3d34c53fc26d0eb72d0b: Status 404 returned error can't find the container with id 5abdc296c3650bb5831a83f73c4089443d7b9b77f70a3d34c53fc26d0eb72d0b Mar 07 07:19:50 crc kubenswrapper[4738]: I0307 07:19:50.000087 4738 generic.go:334] "Generic (PLEG): container finished" podID="844406a3-2357-4470-842a-8ca3ae135554" containerID="32a4f85265e0c73fbb7ad1fbe98d17eb0ca1f084844eb26dab604633b6a6bff3" exitCode=0 Mar 07 07:19:50 crc kubenswrapper[4738]: I0307 07:19:50.000209 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" event={"ID":"844406a3-2357-4470-842a-8ca3ae135554","Type":"ContainerDied","Data":"32a4f85265e0c73fbb7ad1fbe98d17eb0ca1f084844eb26dab604633b6a6bff3"} Mar 07 07:19:50 crc kubenswrapper[4738]: I0307 07:19:50.001070 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" event={"ID":"844406a3-2357-4470-842a-8ca3ae135554","Type":"ContainerStarted","Data":"5abdc296c3650bb5831a83f73c4089443d7b9b77f70a3d34c53fc26d0eb72d0b"} Mar 07 07:19:50 crc kubenswrapper[4738]: I0307 07:19:50.055064 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw"] Mar 07 07:19:50 crc kubenswrapper[4738]: I0307 07:19:50.061720 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw"] Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.327486 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.423122 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/844406a3-2357-4470-842a-8ca3ae135554-scripts\") pod \"844406a3-2357-4470-842a-8ca3ae135554\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.423414 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/844406a3-2357-4470-842a-8ca3ae135554-dispersionconf\") pod \"844406a3-2357-4470-842a-8ca3ae135554\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.423465 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5r9pz\" (UniqueName: \"kubernetes.io/projected/844406a3-2357-4470-842a-8ca3ae135554-kube-api-access-5r9pz\") pod \"844406a3-2357-4470-842a-8ca3ae135554\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.423519 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/844406a3-2357-4470-842a-8ca3ae135554-etc-swift\") pod \"844406a3-2357-4470-842a-8ca3ae135554\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.423586 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/844406a3-2357-4470-842a-8ca3ae135554-ring-data-devices\") pod \"844406a3-2357-4470-842a-8ca3ae135554\" (UID: 
\"844406a3-2357-4470-842a-8ca3ae135554\") " Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.423624 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/844406a3-2357-4470-842a-8ca3ae135554-swiftconf\") pod \"844406a3-2357-4470-842a-8ca3ae135554\" (UID: \"844406a3-2357-4470-842a-8ca3ae135554\") " Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.424048 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/844406a3-2357-4470-842a-8ca3ae135554-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "844406a3-2357-4470-842a-8ca3ae135554" (UID: "844406a3-2357-4470-842a-8ca3ae135554"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.424397 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/844406a3-2357-4470-842a-8ca3ae135554-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.425035 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/844406a3-2357-4470-842a-8ca3ae135554-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "844406a3-2357-4470-842a-8ca3ae135554" (UID: "844406a3-2357-4470-842a-8ca3ae135554"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.431131 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/844406a3-2357-4470-842a-8ca3ae135554-kube-api-access-5r9pz" (OuterVolumeSpecName: "kube-api-access-5r9pz") pod "844406a3-2357-4470-842a-8ca3ae135554" (UID: "844406a3-2357-4470-842a-8ca3ae135554"). InnerVolumeSpecName "kube-api-access-5r9pz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.449484 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/844406a3-2357-4470-842a-8ca3ae135554-scripts" (OuterVolumeSpecName: "scripts") pod "844406a3-2357-4470-842a-8ca3ae135554" (UID: "844406a3-2357-4470-842a-8ca3ae135554"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.463340 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/844406a3-2357-4470-842a-8ca3ae135554-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "844406a3-2357-4470-842a-8ca3ae135554" (UID: "844406a3-2357-4470-842a-8ca3ae135554"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.464338 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/844406a3-2357-4470-842a-8ca3ae135554-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "844406a3-2357-4470-842a-8ca3ae135554" (UID: "844406a3-2357-4470-842a-8ca3ae135554"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.519835 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n"] Mar 07 07:19:51 crc kubenswrapper[4738]: E0307 07:19:51.520742 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="844406a3-2357-4470-842a-8ca3ae135554" containerName="swift-ring-rebalance" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.520763 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="844406a3-2357-4470-842a-8ca3ae135554" containerName="swift-ring-rebalance" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.521012 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="844406a3-2357-4470-842a-8ca3ae135554" containerName="swift-ring-rebalance" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.521483 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.525906 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/844406a3-2357-4470-842a-8ca3ae135554-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.525935 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/844406a3-2357-4470-842a-8ca3ae135554-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.525946 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5r9pz\" (UniqueName: \"kubernetes.io/projected/844406a3-2357-4470-842a-8ca3ae135554-kube-api-access-5r9pz\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.525956 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/844406a3-2357-4470-842a-8ca3ae135554-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.525966 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/844406a3-2357-4470-842a-8ca3ae135554-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.528743 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n"] Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.627822 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21d7a3d8-a912-4470-93d7-9d12e38db487-ring-data-devices\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.627940 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21d7a3d8-a912-4470-93d7-9d12e38db487-swiftconf\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.627978 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhswd\" (UniqueName: \"kubernetes.io/projected/21d7a3d8-a912-4470-93d7-9d12e38db487-kube-api-access-bhswd\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.628057 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/21d7a3d8-a912-4470-93d7-9d12e38db487-scripts\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.628133 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21d7a3d8-a912-4470-93d7-9d12e38db487-etc-swift\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.628341 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21d7a3d8-a912-4470-93d7-9d12e38db487-dispersionconf\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.730406 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21d7a3d8-a912-4470-93d7-9d12e38db487-ring-data-devices\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.730495 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21d7a3d8-a912-4470-93d7-9d12e38db487-swiftconf\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.730524 4738 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-bhswd\" (UniqueName: \"kubernetes.io/projected/21d7a3d8-a912-4470-93d7-9d12e38db487-kube-api-access-bhswd\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.730575 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21d7a3d8-a912-4470-93d7-9d12e38db487-scripts\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.730598 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21d7a3d8-a912-4470-93d7-9d12e38db487-etc-swift\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.730653 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21d7a3d8-a912-4470-93d7-9d12e38db487-dispersionconf\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.731748 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21d7a3d8-a912-4470-93d7-9d12e38db487-etc-swift\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.731792 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21d7a3d8-a912-4470-93d7-9d12e38db487-ring-data-devices\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.732482 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21d7a3d8-a912-4470-93d7-9d12e38db487-scripts\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.735101 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21d7a3d8-a912-4470-93d7-9d12e38db487-dispersionconf\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.735116 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21d7a3d8-a912-4470-93d7-9d12e38db487-swiftconf\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.753895 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhswd\" (UniqueName: \"kubernetes.io/projected/21d7a3d8-a912-4470-93d7-9d12e38db487-kube-api-access-bhswd\") pod \"swift-ring-rebalance-debug-xzd8n\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:51 crc kubenswrapper[4738]: I0307 07:19:51.855686 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:52 crc kubenswrapper[4738]: I0307 07:19:52.027913 4738 scope.go:117] "RemoveContainer" containerID="32a4f85265e0c73fbb7ad1fbe98d17eb0ca1f084844eb26dab604633b6a6bff3" Mar 07 07:19:52 crc kubenswrapper[4738]: I0307 07:19:52.027973 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8b5jw" Mar 07 07:19:52 crc kubenswrapper[4738]: I0307 07:19:52.379862 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n"] Mar 07 07:19:52 crc kubenswrapper[4738]: W0307 07:19:52.389336 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21d7a3d8_a912_4470_93d7_9d12e38db487.slice/crio-a7ff612e681685ef00828811d27a63d52d4e2cbdd1c1286f8d4d0625d63f5c48 WatchSource:0}: Error finding container a7ff612e681685ef00828811d27a63d52d4e2cbdd1c1286f8d4d0625d63f5c48: Status 404 returned error can't find the container with id a7ff612e681685ef00828811d27a63d52d4e2cbdd1c1286f8d4d0625d63f5c48 Mar 07 07:19:52 crc kubenswrapper[4738]: I0307 07:19:52.397784 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="844406a3-2357-4470-842a-8ca3ae135554" path="/var/lib/kubelet/pods/844406a3-2357-4470-842a-8ca3ae135554/volumes" Mar 07 07:19:53 crc kubenswrapper[4738]: I0307 07:19:53.037084 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" event={"ID":"21d7a3d8-a912-4470-93d7-9d12e38db487","Type":"ContainerStarted","Data":"a1119f150a952b18a183cdcd8ca742866959d1872e944347aadf7a64dee76b9b"} Mar 07 07:19:53 crc kubenswrapper[4738]: I0307 07:19:53.037444 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" 
event={"ID":"21d7a3d8-a912-4470-93d7-9d12e38db487","Type":"ContainerStarted","Data":"a7ff612e681685ef00828811d27a63d52d4e2cbdd1c1286f8d4d0625d63f5c48"} Mar 07 07:19:53 crc kubenswrapper[4738]: I0307 07:19:53.070104 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" podStartSLOduration=2.0700756399999998 podStartE2EDuration="2.07007564s" podCreationTimestamp="2026-03-07 07:19:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:19:53.059865652 +0000 UTC m=+1211.524853003" watchObservedRunningTime="2026-03-07 07:19:53.07007564 +0000 UTC m=+1211.535062971" Mar 07 07:19:54 crc kubenswrapper[4738]: I0307 07:19:54.049208 4738 generic.go:334] "Generic (PLEG): container finished" podID="21d7a3d8-a912-4470-93d7-9d12e38db487" containerID="a1119f150a952b18a183cdcd8ca742866959d1872e944347aadf7a64dee76b9b" exitCode=0 Mar 07 07:19:54 crc kubenswrapper[4738]: I0307 07:19:54.049259 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" event={"ID":"21d7a3d8-a912-4470-93d7-9d12e38db487","Type":"ContainerDied","Data":"a1119f150a952b18a183cdcd8ca742866959d1872e944347aadf7a64dee76b9b"} Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.401137 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.459486 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n"] Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.467998 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n"] Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.533420 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-q9gvn"] Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.544895 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-q9gvn"] Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.551379 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.552414 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="swift-recon-cron" containerID="cri-o://e4c18fbcf45c419a845fca6021065e426ac363e04fbaedafa8d5996752493737" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.552658 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-updater" containerID="cri-o://bf95213389fa376cb539bf6d8ad3ac8578576ffa41752dec1d469f2c9f7d6709" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.552758 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="rsync" containerID="cri-o://6fc2d79e6ed76a2aa3902d1f04776c46bc6e6c351eb75738758e7d758faea391" gracePeriod=30 
Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.552795 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-expirer" containerID="cri-o://447e67220d4537b714a6ce7a1b0dc1da7d80f2bb3a44e88e3217ceea7c2cc9fb" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.552828 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-updater" containerID="cri-o://52a417a8eb045429236afe6ab639ca8c67812669b31cd87f19e7fedeb5dbbb8f" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.552860 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-auditor" containerID="cri-o://ff3f41c50d65750c83aa588726223a30040cd27c4fe4b1cd143575c10ea82fbb" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.552889 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-replicator" containerID="cri-o://3d2c3ed7df67a6ff4073f7d5d24e40afac401c61651cdd6cc0c5759241ca9803" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.552917 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-server" containerID="cri-o://56815f9730bc75bdcd8f18fc8d3075fc23567b21f96c07f6bab481f604850c00" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.552947 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" 
podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-server" containerID="cri-o://e8f567722e26c5855f8a335226bfbd216ea673204774b39a300d238c1101b3de" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.552978 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-auditor" containerID="cri-o://526da633fdfa13ab59af0be631f0a57382d2c13361cb1aa538f116b31206c3a6" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.553006 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-replicator" containerID="cri-o://50d54dad2c11e1433417c1d5eacc2525ef13ccf32383fac908e7da619461d287" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.553037 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-auditor" containerID="cri-o://32adec2ee3d7eb5e9f9c524bd23ecaedb1d049e703cc9542af2e891a02130be4" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.553068 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-replicator" containerID="cri-o://bf7ea1f65a1c10b00d492aa501355337b6dba851f133cc9e81fc1dbcb1865718" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.553105 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-reaper" containerID="cri-o://74c28c6df141378f88f7488ff6bfdacddb398f412f23b920bdc8e2344aa82df6" gracePeriod=30 Mar 07 07:19:55 crc 
kubenswrapper[4738]: I0307 07:19:55.552627 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-1" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-server" containerID="cri-o://b81632c1938ae5c7f37a56623bb7e08e9c9b83799f684481b5b287988a038ff8" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.558093 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.558759 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-server" containerID="cri-o://2fff460fa23e4d25e3da3e96d8425d2eb15036c2351751c1a3aebcd697cfb1d9" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.558887 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="swift-recon-cron" containerID="cri-o://3c4cc8214b36420635ba039d3c00c6948d58d957980e0d1c9562383bfca77fc9" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.558949 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="rsync" containerID="cri-o://8c2fe218c6e02f331d33d7063e8482d45cd9f32b3789ca9c49d926c67499fc14" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.558992 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-expirer" containerID="cri-o://39b4e0f1e6d7033b1684d6624ba3e2c94b2b55568b02cdcebd3bae46f528dfc1" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.559034 4738 
kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-updater" containerID="cri-o://01acc461f0e0e846b5a4e7ada9595d0b8c4c9683efed29b5353bf8ef995cf52a" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.559083 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-auditor" containerID="cri-o://40f811bec3d868d647e181fbd2ff35f48f2b70d83c3724ab4152f5aaefd0edab" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.559128 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-replicator" containerID="cri-o://f6afa155d642e49003483537eaf5f80def6b71bbd9ccee398e1d91a13ac268a8" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.559217 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-server" containerID="cri-o://c9accd7d56723a30b1e064c6db482dfe3c0c8c25a3362cb503fc9e1740a21d81" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.559271 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-updater" containerID="cri-o://cbcd5ef3a6beceade6a10aeef664573859ddea9850fed0f9280c7405361c726f" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.559318 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-auditor" 
containerID="cri-o://d87d9a39b92a07b8170662ba15ed360e84ba521d89ecbb449261b231eb5f9b80" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.559373 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-replicator" containerID="cri-o://0c1b0371d893f6e0390810a8aab913c1d9657f66534ae42885c70690589bf9e9" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.559435 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-auditor" containerID="cri-o://0d5b3c74b209817952ccb739c381b7b9d12d42e85d7e4044ba3391791272d781" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.559481 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-replicator" containerID="cri-o://be3d8c83518f0f3933501a1d36cf0fbd8bfd4dd31113abf7e182f8d39be340cb" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.559545 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-server" containerID="cri-o://53b976cf051f11cb8f660e774114d0ee41f535e9dedd4f043ca21c490af87734" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.559593 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-2" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-reaper" containerID="cri-o://c5c5b4afaf52f1ecfc98c8dcfbd532bf19abe26898fc6b73352a7c16747d6d59" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.568477 4738 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.568907 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-server" containerID="cri-o://c9a84b5ce18a92c7cd8e2aa843f5b3cacd9d67fea7fb130fc739af256a55b82c" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.568942 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="swift-recon-cron" containerID="cri-o://3da28d20ae0901df855c4510805efbf79e985addd356e9506c54db5f6d027287" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.569018 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="rsync" containerID="cri-o://4ac2a985e3cb60a09d256b7974029df113f3080d4297a16f859b41d021df8333" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.569020 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-updater" containerID="cri-o://023ab348f9630e3188399acebb441225f16fe8a3ea0a45562a1416ce00ed2a1d" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.569060 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-auditor" containerID="cri-o://268471195f290d34a1640c6c40556c0cc0df5c38ba265a2054454572ddb1a8aa" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.569065 4738 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="swift-kuttl-tests/swift-storage-0" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-expirer" containerID="cri-o://90084ef90d77e89230917bd780cb78df4fafd97840fe50ada61c1c20a43914f9" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.569091 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-replicator" containerID="cri-o://fea746265db6cca797f4aab10f0142d479f000ff811043ba2d786b1386e70f97" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.569103 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-updater" containerID="cri-o://b4724281a96bff261edcd51fb75fe7f352935e15dfd29e22addc75c7164d87df" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.569119 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-server" containerID="cri-o://634c98d8bd723fe8365bb0a579d302058fe17e15dc44a8c89fb077a91f9db720" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.569140 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-auditor" containerID="cri-o://0bd0f18099d0d2501aaf5782a927bc5913e1c189ce48a68e47f375657b6799b0" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.569204 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-replicator" containerID="cri-o://137defbc952bf81c1f496e28b2c150ef0bf8561eb3634039ffac430db7e04802" 
gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.569293 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-auditor" containerID="cri-o://a2ffa63c4afb5dd07e9a57782d9c93b48afd95857655b0d188d0b3255e43c57d" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.569345 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-replicator" containerID="cri-o://7e3233cc25e094b9420df4a7ee6bad5a6ddbd092e5b08e43d31d29b920a2d134" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.568920 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-server" containerID="cri-o://5a05ae7d39e3985637e6577343ab598669a43dc85bcd5dd09ddc9b373c1cc837" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.569148 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-reaper" containerID="cri-o://5c8a691cfe1a8ab8c30a26b23434cdb8ee4f0a45358a201eb4ecef3b2797785a" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.606844 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21d7a3d8-a912-4470-93d7-9d12e38db487-ring-data-devices\") pod \"21d7a3d8-a912-4470-93d7-9d12e38db487\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.606905 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/21d7a3d8-a912-4470-93d7-9d12e38db487-swiftconf\") pod \"21d7a3d8-a912-4470-93d7-9d12e38db487\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.606964 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21d7a3d8-a912-4470-93d7-9d12e38db487-etc-swift\") pod \"21d7a3d8-a912-4470-93d7-9d12e38db487\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.607042 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhswd\" (UniqueName: \"kubernetes.io/projected/21d7a3d8-a912-4470-93d7-9d12e38db487-kube-api-access-bhswd\") pod \"21d7a3d8-a912-4470-93d7-9d12e38db487\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.607078 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21d7a3d8-a912-4470-93d7-9d12e38db487-scripts\") pod \"21d7a3d8-a912-4470-93d7-9d12e38db487\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.607154 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21d7a3d8-a912-4470-93d7-9d12e38db487-dispersionconf\") pod \"21d7a3d8-a912-4470-93d7-9d12e38db487\" (UID: \"21d7a3d8-a912-4470-93d7-9d12e38db487\") " Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.608885 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/21d7a3d8-a912-4470-93d7-9d12e38db487-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "21d7a3d8-a912-4470-93d7-9d12e38db487" (UID: "21d7a3d8-a912-4470-93d7-9d12e38db487"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.610538 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21d7a3d8-a912-4470-93d7-9d12e38db487-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "21d7a3d8-a912-4470-93d7-9d12e38db487" (UID: "21d7a3d8-a912-4470-93d7-9d12e38db487"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.613648 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-rsw5w"] Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.613860 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" podUID="d469c92b-a4c2-488c-b38d-fbc142aaa0f0" containerName="proxy-httpd" containerID="cri-o://50cb6e29d71a2457f9a390a1f4d77a5922d8ae1fc02ffa5c815026823eb0ca0d" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.614250 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" podUID="d469c92b-a4c2-488c-b38d-fbc142aaa0f0" containerName="proxy-server" containerID="cri-o://5a6d3189cedcc750eda7e33f741ffda29b03f79af55f5972991d95dc96bd5369" gracePeriod=30 Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.647396 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21d7a3d8-a912-4470-93d7-9d12e38db487-kube-api-access-bhswd" (OuterVolumeSpecName: "kube-api-access-bhswd") pod "21d7a3d8-a912-4470-93d7-9d12e38db487" (UID: "21d7a3d8-a912-4470-93d7-9d12e38db487"). InnerVolumeSpecName "kube-api-access-bhswd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.697814 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d7a3d8-a912-4470-93d7-9d12e38db487-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "21d7a3d8-a912-4470-93d7-9d12e38db487" (UID: "21d7a3d8-a912-4470-93d7-9d12e38db487"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.697779 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21d7a3d8-a912-4470-93d7-9d12e38db487-scripts" (OuterVolumeSpecName: "scripts") pod "21d7a3d8-a912-4470-93d7-9d12e38db487" (UID: "21d7a3d8-a912-4470-93d7-9d12e38db487"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.709282 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21d7a3d8-a912-4470-93d7-9d12e38db487-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "21d7a3d8-a912-4470-93d7-9d12e38db487" (UID: "21d7a3d8-a912-4470-93d7-9d12e38db487"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.710450 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/21d7a3d8-a912-4470-93d7-9d12e38db487-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.710490 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/21d7a3d8-a912-4470-93d7-9d12e38db487-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.710501 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhswd\" (UniqueName: \"kubernetes.io/projected/21d7a3d8-a912-4470-93d7-9d12e38db487-kube-api-access-bhswd\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.710510 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/21d7a3d8-a912-4470-93d7-9d12e38db487-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.710518 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/21d7a3d8-a912-4470-93d7-9d12e38db487-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:55 crc kubenswrapper[4738]: I0307 07:19:55.710526 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/21d7a3d8-a912-4470-93d7-9d12e38db487-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.074743 4738 generic.go:334] "Generic (PLEG): container finished" podID="0a930684-b4c9-4238-a12d-01befcc25561" containerID="6fc2d79e6ed76a2aa3902d1f04776c46bc6e6c351eb75738758e7d758faea391" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.074803 4738 generic.go:334] "Generic (PLEG): 
container finished" podID="0a930684-b4c9-4238-a12d-01befcc25561" containerID="447e67220d4537b714a6ce7a1b0dc1da7d80f2bb3a44e88e3217ceea7c2cc9fb" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.074813 4738 generic.go:334] "Generic (PLEG): container finished" podID="0a930684-b4c9-4238-a12d-01befcc25561" containerID="52a417a8eb045429236afe6ab639ca8c67812669b31cd87f19e7fedeb5dbbb8f" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.074804 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"6fc2d79e6ed76a2aa3902d1f04776c46bc6e6c351eb75738758e7d758faea391"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.074880 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"447e67220d4537b714a6ce7a1b0dc1da7d80f2bb3a44e88e3217ceea7c2cc9fb"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.074901 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"52a417a8eb045429236afe6ab639ca8c67812669b31cd87f19e7fedeb5dbbb8f"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.074915 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"ff3f41c50d65750c83aa588726223a30040cd27c4fe4b1cd143575c10ea82fbb"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.074824 4738 generic.go:334] "Generic (PLEG): container finished" podID="0a930684-b4c9-4238-a12d-01befcc25561" containerID="ff3f41c50d65750c83aa588726223a30040cd27c4fe4b1cd143575c10ea82fbb" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.074948 4738 generic.go:334] "Generic (PLEG): 
container finished" podID="0a930684-b4c9-4238-a12d-01befcc25561" containerID="3d2c3ed7df67a6ff4073f7d5d24e40afac401c61651cdd6cc0c5759241ca9803" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.074968 4738 generic.go:334] "Generic (PLEG): container finished" podID="0a930684-b4c9-4238-a12d-01befcc25561" containerID="56815f9730bc75bdcd8f18fc8d3075fc23567b21f96c07f6bab481f604850c00" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.074979 4738 generic.go:334] "Generic (PLEG): container finished" podID="0a930684-b4c9-4238-a12d-01befcc25561" containerID="bf95213389fa376cb539bf6d8ad3ac8578576ffa41752dec1d469f2c9f7d6709" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.074987 4738 generic.go:334] "Generic (PLEG): container finished" podID="0a930684-b4c9-4238-a12d-01befcc25561" containerID="526da633fdfa13ab59af0be631f0a57382d2c13361cb1aa538f116b31206c3a6" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.074996 4738 generic.go:334] "Generic (PLEG): container finished" podID="0a930684-b4c9-4238-a12d-01befcc25561" containerID="50d54dad2c11e1433417c1d5eacc2525ef13ccf32383fac908e7da619461d287" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.075006 4738 generic.go:334] "Generic (PLEG): container finished" podID="0a930684-b4c9-4238-a12d-01befcc25561" containerID="e8f567722e26c5855f8a335226bfbd216ea673204774b39a300d238c1101b3de" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.075019 4738 generic.go:334] "Generic (PLEG): container finished" podID="0a930684-b4c9-4238-a12d-01befcc25561" containerID="74c28c6df141378f88f7488ff6bfdacddb398f412f23b920bdc8e2344aa82df6" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.075030 4738 generic.go:334] "Generic (PLEG): container finished" podID="0a930684-b4c9-4238-a12d-01befcc25561" containerID="32adec2ee3d7eb5e9f9c524bd23ecaedb1d049e703cc9542af2e891a02130be4" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 
07:19:56.075040 4738 generic.go:334] "Generic (PLEG): container finished" podID="0a930684-b4c9-4238-a12d-01befcc25561" containerID="bf7ea1f65a1c10b00d492aa501355337b6dba851f133cc9e81fc1dbcb1865718" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.075030 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"3d2c3ed7df67a6ff4073f7d5d24e40afac401c61651cdd6cc0c5759241ca9803"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.075079 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"56815f9730bc75bdcd8f18fc8d3075fc23567b21f96c07f6bab481f604850c00"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.075092 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"bf95213389fa376cb539bf6d8ad3ac8578576ffa41752dec1d469f2c9f7d6709"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.075103 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"526da633fdfa13ab59af0be631f0a57382d2c13361cb1aa538f116b31206c3a6"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.075116 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"50d54dad2c11e1433417c1d5eacc2525ef13ccf32383fac908e7da619461d287"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.075131 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" 
event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"e8f567722e26c5855f8a335226bfbd216ea673204774b39a300d238c1101b3de"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.075143 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"74c28c6df141378f88f7488ff6bfdacddb398f412f23b920bdc8e2344aa82df6"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.075196 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"32adec2ee3d7eb5e9f9c524bd23ecaedb1d049e703cc9542af2e891a02130be4"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.075211 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"bf7ea1f65a1c10b00d492aa501355337b6dba851f133cc9e81fc1dbcb1865718"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082260 4738 generic.go:334] "Generic (PLEG): container finished" podID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerID="8c2fe218c6e02f331d33d7063e8482d45cd9f32b3789ca9c49d926c67499fc14" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082305 4738 generic.go:334] "Generic (PLEG): container finished" podID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerID="39b4e0f1e6d7033b1684d6624ba3e2c94b2b55568b02cdcebd3bae46f528dfc1" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082313 4738 generic.go:334] "Generic (PLEG): container finished" podID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerID="01acc461f0e0e846b5a4e7ada9595d0b8c4c9683efed29b5353bf8ef995cf52a" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082321 4738 generic.go:334] "Generic (PLEG): container finished" 
podID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerID="40f811bec3d868d647e181fbd2ff35f48f2b70d83c3724ab4152f5aaefd0edab" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082329 4738 generic.go:334] "Generic (PLEG): container finished" podID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerID="f6afa155d642e49003483537eaf5f80def6b71bbd9ccee398e1d91a13ac268a8" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082338 4738 generic.go:334] "Generic (PLEG): container finished" podID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerID="c9accd7d56723a30b1e064c6db482dfe3c0c8c25a3362cb503fc9e1740a21d81" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082329 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"8c2fe218c6e02f331d33d7063e8482d45cd9f32b3789ca9c49d926c67499fc14"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082382 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"39b4e0f1e6d7033b1684d6624ba3e2c94b2b55568b02cdcebd3bae46f528dfc1"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082410 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"01acc461f0e0e846b5a4e7ada9595d0b8c4c9683efed29b5353bf8ef995cf52a"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082422 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"40f811bec3d868d647e181fbd2ff35f48f2b70d83c3724ab4152f5aaefd0edab"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082432 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"f6afa155d642e49003483537eaf5f80def6b71bbd9ccee398e1d91a13ac268a8"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082442 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"c9accd7d56723a30b1e064c6db482dfe3c0c8c25a3362cb503fc9e1740a21d81"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082452 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"cbcd5ef3a6beceade6a10aeef664573859ddea9850fed0f9280c7405361c726f"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082346 4738 generic.go:334] "Generic (PLEG): container finished" podID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerID="cbcd5ef3a6beceade6a10aeef664573859ddea9850fed0f9280c7405361c726f" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082476 4738 generic.go:334] "Generic (PLEG): container finished" podID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerID="d87d9a39b92a07b8170662ba15ed360e84ba521d89ecbb449261b231eb5f9b80" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082497 4738 generic.go:334] "Generic (PLEG): container finished" podID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerID="0c1b0371d893f6e0390810a8aab913c1d9657f66534ae42885c70690589bf9e9" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082506 4738 generic.go:334] "Generic (PLEG): container finished" podID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerID="53b976cf051f11cb8f660e774114d0ee41f535e9dedd4f043ca21c490af87734" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082515 4738 generic.go:334] "Generic (PLEG): container finished" podID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" 
containerID="c5c5b4afaf52f1ecfc98c8dcfbd532bf19abe26898fc6b73352a7c16747d6d59" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082524 4738 generic.go:334] "Generic (PLEG): container finished" podID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerID="0d5b3c74b209817952ccb739c381b7b9d12d42e85d7e4044ba3391791272d781" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082532 4738 generic.go:334] "Generic (PLEG): container finished" podID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerID="be3d8c83518f0f3933501a1d36cf0fbd8bfd4dd31113abf7e182f8d39be340cb" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082540 4738 generic.go:334] "Generic (PLEG): container finished" podID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerID="2fff460fa23e4d25e3da3e96d8425d2eb15036c2351751c1a3aebcd697cfb1d9" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082587 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"d87d9a39b92a07b8170662ba15ed360e84ba521d89ecbb449261b231eb5f9b80"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082597 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"0c1b0371d893f6e0390810a8aab913c1d9657f66534ae42885c70690589bf9e9"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082608 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"53b976cf051f11cb8f660e774114d0ee41f535e9dedd4f043ca21c490af87734"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082618 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" 
event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"c5c5b4afaf52f1ecfc98c8dcfbd532bf19abe26898fc6b73352a7c16747d6d59"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082627 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"0d5b3c74b209817952ccb739c381b7b9d12d42e85d7e4044ba3391791272d781"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082637 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"be3d8c83518f0f3933501a1d36cf0fbd8bfd4dd31113abf7e182f8d39be340cb"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.082646 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"2fff460fa23e4d25e3da3e96d8425d2eb15036c2351751c1a3aebcd697cfb1d9"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.084561 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xzd8n" Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.084672 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7ff612e681685ef00828811d27a63d52d4e2cbdd1c1286f8d4d0625d63f5c48" Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107002 4738 generic.go:334] "Generic (PLEG): container finished" podID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerID="90084ef90d77e89230917bd780cb78df4fafd97840fe50ada61c1c20a43914f9" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107042 4738 generic.go:334] "Generic (PLEG): container finished" podID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerID="b4724281a96bff261edcd51fb75fe7f352935e15dfd29e22addc75c7164d87df" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107059 4738 generic.go:334] "Generic (PLEG): container finished" podID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerID="0bd0f18099d0d2501aaf5782a927bc5913e1c189ce48a68e47f375657b6799b0" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107066 4738 generic.go:334] "Generic (PLEG): container finished" podID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerID="137defbc952bf81c1f496e28b2c150ef0bf8561eb3634039ffac430db7e04802" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107073 4738 generic.go:334] "Generic (PLEG): container finished" podID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerID="5a05ae7d39e3985637e6577343ab598669a43dc85bcd5dd09ddc9b373c1cc837" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107080 4738 generic.go:334] "Generic (PLEG): container finished" podID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerID="023ab348f9630e3188399acebb441225f16fe8a3ea0a45562a1416ce00ed2a1d" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107090 4738 generic.go:334] "Generic (PLEG): container finished" 
podID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerID="268471195f290d34a1640c6c40556c0cc0df5c38ba265a2054454572ddb1a8aa" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107097 4738 generic.go:334] "Generic (PLEG): container finished" podID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerID="fea746265db6cca797f4aab10f0142d479f000ff811043ba2d786b1386e70f97" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107104 4738 generic.go:334] "Generic (PLEG): container finished" podID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerID="634c98d8bd723fe8365bb0a579d302058fe17e15dc44a8c89fb077a91f9db720" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107111 4738 generic.go:334] "Generic (PLEG): container finished" podID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerID="5c8a691cfe1a8ab8c30a26b23434cdb8ee4f0a45358a201eb4ecef3b2797785a" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107117 4738 generic.go:334] "Generic (PLEG): container finished" podID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerID="a2ffa63c4afb5dd07e9a57782d9c93b48afd95857655b0d188d0b3255e43c57d" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107124 4738 generic.go:334] "Generic (PLEG): container finished" podID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerID="7e3233cc25e094b9420df4a7ee6bad5a6ddbd092e5b08e43d31d29b920a2d134" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107209 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"90084ef90d77e89230917bd780cb78df4fafd97840fe50ada61c1c20a43914f9"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107241 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"b4724281a96bff261edcd51fb75fe7f352935e15dfd29e22addc75c7164d87df"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107253 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"0bd0f18099d0d2501aaf5782a927bc5913e1c189ce48a68e47f375657b6799b0"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107281 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"137defbc952bf81c1f496e28b2c150ef0bf8561eb3634039ffac430db7e04802"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107291 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"5a05ae7d39e3985637e6577343ab598669a43dc85bcd5dd09ddc9b373c1cc837"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107300 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"023ab348f9630e3188399acebb441225f16fe8a3ea0a45562a1416ce00ed2a1d"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107315 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"268471195f290d34a1640c6c40556c0cc0df5c38ba265a2054454572ddb1a8aa"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107327 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"fea746265db6cca797f4aab10f0142d479f000ff811043ba2d786b1386e70f97"} Mar 07 
07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107337 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"634c98d8bd723fe8365bb0a579d302058fe17e15dc44a8c89fb077a91f9db720"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107348 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"5c8a691cfe1a8ab8c30a26b23434cdb8ee4f0a45358a201eb4ecef3b2797785a"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107401 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"a2ffa63c4afb5dd07e9a57782d9c93b48afd95857655b0d188d0b3255e43c57d"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.107415 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"7e3233cc25e094b9420df4a7ee6bad5a6ddbd092e5b08e43d31d29b920a2d134"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.114779 4738 generic.go:334] "Generic (PLEG): container finished" podID="d469c92b-a4c2-488c-b38d-fbc142aaa0f0" containerID="50cb6e29d71a2457f9a390a1f4d77a5922d8ae1fc02ffa5c815026823eb0ca0d" exitCode=0 Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.114828 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" event={"ID":"d469c92b-a4c2-488c-b38d-fbc142aaa0f0","Type":"ContainerDied","Data":"50cb6e29d71a2457f9a390a1f4d77a5922d8ae1fc02ffa5c815026823eb0ca0d"} Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.400365 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="022c5cfe-d18f-420f-8f7d-ac95c111dc54" 
path="/var/lib/kubelet/pods/022c5cfe-d18f-420f-8f7d-ac95c111dc54/volumes" Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.401716 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21d7a3d8-a912-4470-93d7-9d12e38db487" path="/var/lib/kubelet/pods/21d7a3d8-a912-4470-93d7-9d12e38db487/volumes" Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.572459 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.732534 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb4fk\" (UniqueName: \"kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-kube-api-access-jb4fk\") pod \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.732599 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-run-httpd\") pod \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.732627 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-log-httpd\") pod \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.732716 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift\") pod \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.732757 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-config-data\") pod \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\" (UID: \"d469c92b-a4c2-488c-b38d-fbc142aaa0f0\") " Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.733355 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d469c92b-a4c2-488c-b38d-fbc142aaa0f0" (UID: "d469c92b-a4c2-488c-b38d-fbc142aaa0f0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.733427 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d469c92b-a4c2-488c-b38d-fbc142aaa0f0" (UID: "d469c92b-a4c2-488c-b38d-fbc142aaa0f0"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.739080 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d469c92b-a4c2-488c-b38d-fbc142aaa0f0" (UID: "d469c92b-a4c2-488c-b38d-fbc142aaa0f0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.739838 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-kube-api-access-jb4fk" (OuterVolumeSpecName: "kube-api-access-jb4fk") pod "d469c92b-a4c2-488c-b38d-fbc142aaa0f0" (UID: "d469c92b-a4c2-488c-b38d-fbc142aaa0f0"). InnerVolumeSpecName "kube-api-access-jb4fk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.779843 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-config-data" (OuterVolumeSpecName: "config-data") pod "d469c92b-a4c2-488c-b38d-fbc142aaa0f0" (UID: "d469c92b-a4c2-488c-b38d-fbc142aaa0f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.834590 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb4fk\" (UniqueName: \"kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-kube-api-access-jb4fk\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.834626 4738 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.834636 4738 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.834645 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:56 crc kubenswrapper[4738]: I0307 07:19:56.834655 4738 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d469c92b-a4c2-488c-b38d-fbc142aaa0f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.133425 4738 generic.go:334] "Generic (PLEG): container finished" podID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" 
containerID="4ac2a985e3cb60a09d256b7974029df113f3080d4297a16f859b41d021df8333" exitCode=0 Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.134081 4738 generic.go:334] "Generic (PLEG): container finished" podID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerID="c9a84b5ce18a92c7cd8e2aa843f5b3cacd9d67fea7fb130fc739af256a55b82c" exitCode=0 Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.133528 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"4ac2a985e3cb60a09d256b7974029df113f3080d4297a16f859b41d021df8333"} Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.134322 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"c9a84b5ce18a92c7cd8e2aa843f5b3cacd9d67fea7fb130fc739af256a55b82c"} Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.137515 4738 generic.go:334] "Generic (PLEG): container finished" podID="d469c92b-a4c2-488c-b38d-fbc142aaa0f0" containerID="5a6d3189cedcc750eda7e33f741ffda29b03f79af55f5972991d95dc96bd5369" exitCode=0 Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.137615 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.137615 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" event={"ID":"d469c92b-a4c2-488c-b38d-fbc142aaa0f0","Type":"ContainerDied","Data":"5a6d3189cedcc750eda7e33f741ffda29b03f79af55f5972991d95dc96bd5369"} Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.137753 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-rsw5w" event={"ID":"d469c92b-a4c2-488c-b38d-fbc142aaa0f0","Type":"ContainerDied","Data":"854556c446e68d0325a0740f1187e6ec503bf1a7f9c40fa5402b2e122a9b54ba"} Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.137804 4738 scope.go:117] "RemoveContainer" containerID="5a6d3189cedcc750eda7e33f741ffda29b03f79af55f5972991d95dc96bd5369" Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.149667 4738 generic.go:334] "Generic (PLEG): container finished" podID="0a930684-b4c9-4238-a12d-01befcc25561" containerID="b81632c1938ae5c7f37a56623bb7e08e9c9b83799f684481b5b287988a038ff8" exitCode=0 Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.149742 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"b81632c1938ae5c7f37a56623bb7e08e9c9b83799f684481b5b287988a038ff8"} Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.173180 4738 scope.go:117] "RemoveContainer" containerID="50cb6e29d71a2457f9a390a1f4d77a5922d8ae1fc02ffa5c815026823eb0ca0d" Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.185007 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-rsw5w"] Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.191267 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-rsw5w"] Mar 07 07:19:57 
crc kubenswrapper[4738]: I0307 07:19:57.205906 4738 scope.go:117] "RemoveContainer" containerID="5a6d3189cedcc750eda7e33f741ffda29b03f79af55f5972991d95dc96bd5369" Mar 07 07:19:57 crc kubenswrapper[4738]: E0307 07:19:57.206554 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a6d3189cedcc750eda7e33f741ffda29b03f79af55f5972991d95dc96bd5369\": container with ID starting with 5a6d3189cedcc750eda7e33f741ffda29b03f79af55f5972991d95dc96bd5369 not found: ID does not exist" containerID="5a6d3189cedcc750eda7e33f741ffda29b03f79af55f5972991d95dc96bd5369" Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.206618 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a6d3189cedcc750eda7e33f741ffda29b03f79af55f5972991d95dc96bd5369"} err="failed to get container status \"5a6d3189cedcc750eda7e33f741ffda29b03f79af55f5972991d95dc96bd5369\": rpc error: code = NotFound desc = could not find container \"5a6d3189cedcc750eda7e33f741ffda29b03f79af55f5972991d95dc96bd5369\": container with ID starting with 5a6d3189cedcc750eda7e33f741ffda29b03f79af55f5972991d95dc96bd5369 not found: ID does not exist" Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.206650 4738 scope.go:117] "RemoveContainer" containerID="50cb6e29d71a2457f9a390a1f4d77a5922d8ae1fc02ffa5c815026823eb0ca0d" Mar 07 07:19:57 crc kubenswrapper[4738]: E0307 07:19:57.207265 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50cb6e29d71a2457f9a390a1f4d77a5922d8ae1fc02ffa5c815026823eb0ca0d\": container with ID starting with 50cb6e29d71a2457f9a390a1f4d77a5922d8ae1fc02ffa5c815026823eb0ca0d not found: ID does not exist" containerID="50cb6e29d71a2457f9a390a1f4d77a5922d8ae1fc02ffa5c815026823eb0ca0d" Mar 07 07:19:57 crc kubenswrapper[4738]: I0307 07:19:57.207347 4738 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"50cb6e29d71a2457f9a390a1f4d77a5922d8ae1fc02ffa5c815026823eb0ca0d"} err="failed to get container status \"50cb6e29d71a2457f9a390a1f4d77a5922d8ae1fc02ffa5c815026823eb0ca0d\": rpc error: code = NotFound desc = could not find container \"50cb6e29d71a2457f9a390a1f4d77a5922d8ae1fc02ffa5c815026823eb0ca0d\": container with ID starting with 50cb6e29d71a2457f9a390a1f4d77a5922d8ae1fc02ffa5c815026823eb0ca0d not found: ID does not exist" Mar 07 07:19:58 crc kubenswrapper[4738]: I0307 07:19:58.399666 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d469c92b-a4c2-488c-b38d-fbc142aaa0f0" path="/var/lib/kubelet/pods/d469c92b-a4c2-488c-b38d-fbc142aaa0f0/volumes" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.134035 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547800-92slb"] Mar 07 07:20:00 crc kubenswrapper[4738]: E0307 07:20:00.135106 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21d7a3d8-a912-4470-93d7-9d12e38db487" containerName="swift-ring-rebalance" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.135127 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="21d7a3d8-a912-4470-93d7-9d12e38db487" containerName="swift-ring-rebalance" Mar 07 07:20:00 crc kubenswrapper[4738]: E0307 07:20:00.135179 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d469c92b-a4c2-488c-b38d-fbc142aaa0f0" containerName="proxy-httpd" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.135191 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="d469c92b-a4c2-488c-b38d-fbc142aaa0f0" containerName="proxy-httpd" Mar 07 07:20:00 crc kubenswrapper[4738]: E0307 07:20:00.135203 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d469c92b-a4c2-488c-b38d-fbc142aaa0f0" containerName="proxy-server" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.135212 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d469c92b-a4c2-488c-b38d-fbc142aaa0f0" containerName="proxy-server" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.135411 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="d469c92b-a4c2-488c-b38d-fbc142aaa0f0" containerName="proxy-server" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.135431 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="21d7a3d8-a912-4470-93d7-9d12e38db487" containerName="swift-ring-rebalance" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.135457 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="d469c92b-a4c2-488c-b38d-fbc142aaa0f0" containerName="proxy-httpd" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.136244 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547800-92slb" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.139920 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.140134 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.139981 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.154764 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547800-92slb"] Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.296113 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz9r7\" (UniqueName: \"kubernetes.io/projected/6c463008-ff10-4f2f-9d74-1b78a30c5720-kube-api-access-tz9r7\") pod \"auto-csr-approver-29547800-92slb\" (UID: \"6c463008-ff10-4f2f-9d74-1b78a30c5720\") " 
pod="openshift-infra/auto-csr-approver-29547800-92slb" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.397623 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tz9r7\" (UniqueName: \"kubernetes.io/projected/6c463008-ff10-4f2f-9d74-1b78a30c5720-kube-api-access-tz9r7\") pod \"auto-csr-approver-29547800-92slb\" (UID: \"6c463008-ff10-4f2f-9d74-1b78a30c5720\") " pod="openshift-infra/auto-csr-approver-29547800-92slb" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.436837 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz9r7\" (UniqueName: \"kubernetes.io/projected/6c463008-ff10-4f2f-9d74-1b78a30c5720-kube-api-access-tz9r7\") pod \"auto-csr-approver-29547800-92slb\" (UID: \"6c463008-ff10-4f2f-9d74-1b78a30c5720\") " pod="openshift-infra/auto-csr-approver-29547800-92slb" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.470026 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547800-92slb" Mar 07 07:20:00 crc kubenswrapper[4738]: I0307 07:20:00.740784 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547800-92slb"] Mar 07 07:20:01 crc kubenswrapper[4738]: I0307 07:20:01.188559 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547800-92slb" event={"ID":"6c463008-ff10-4f2f-9d74-1b78a30c5720","Type":"ContainerStarted","Data":"c71b7ff05d9c69561a8e0ab655858608c0044d0eb035a5845b17ac95079de6df"} Mar 07 07:20:02 crc kubenswrapper[4738]: I0307 07:20:02.204146 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547800-92slb" event={"ID":"6c463008-ff10-4f2f-9d74-1b78a30c5720","Type":"ContainerStarted","Data":"df5138da1b2a5108d9af7e375900102cbed7c5796f953e33700da6d1b19fd24d"} Mar 07 07:20:02 crc kubenswrapper[4738]: I0307 07:20:02.228675 4738 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547800-92slb" podStartSLOduration=1.133660259 podStartE2EDuration="2.228649635s" podCreationTimestamp="2026-03-07 07:20:00 +0000 UTC" firstStartedPulling="2026-03-07 07:20:00.758662448 +0000 UTC m=+1219.223649759" lastFinishedPulling="2026-03-07 07:20:01.853651774 +0000 UTC m=+1220.318639135" observedRunningTime="2026-03-07 07:20:02.222471326 +0000 UTC m=+1220.687458647" watchObservedRunningTime="2026-03-07 07:20:02.228649635 +0000 UTC m=+1220.693636966" Mar 07 07:20:03 crc kubenswrapper[4738]: I0307 07:20:03.224380 4738 generic.go:334] "Generic (PLEG): container finished" podID="6c463008-ff10-4f2f-9d74-1b78a30c5720" containerID="df5138da1b2a5108d9af7e375900102cbed7c5796f953e33700da6d1b19fd24d" exitCode=0 Mar 07 07:20:03 crc kubenswrapper[4738]: I0307 07:20:03.224468 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547800-92slb" event={"ID":"6c463008-ff10-4f2f-9d74-1b78a30c5720","Type":"ContainerDied","Data":"df5138da1b2a5108d9af7e375900102cbed7c5796f953e33700da6d1b19fd24d"} Mar 07 07:20:04 crc kubenswrapper[4738]: I0307 07:20:04.612307 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547800-92slb" Mar 07 07:20:04 crc kubenswrapper[4738]: I0307 07:20:04.692138 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz9r7\" (UniqueName: \"kubernetes.io/projected/6c463008-ff10-4f2f-9d74-1b78a30c5720-kube-api-access-tz9r7\") pod \"6c463008-ff10-4f2f-9d74-1b78a30c5720\" (UID: \"6c463008-ff10-4f2f-9d74-1b78a30c5720\") " Mar 07 07:20:04 crc kubenswrapper[4738]: I0307 07:20:04.698492 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c463008-ff10-4f2f-9d74-1b78a30c5720-kube-api-access-tz9r7" (OuterVolumeSpecName: "kube-api-access-tz9r7") pod "6c463008-ff10-4f2f-9d74-1b78a30c5720" (UID: "6c463008-ff10-4f2f-9d74-1b78a30c5720"). InnerVolumeSpecName "kube-api-access-tz9r7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:20:04 crc kubenswrapper[4738]: I0307 07:20:04.794518 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz9r7\" (UniqueName: \"kubernetes.io/projected/6c463008-ff10-4f2f-9d74-1b78a30c5720-kube-api-access-tz9r7\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:05 crc kubenswrapper[4738]: I0307 07:20:05.248489 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547800-92slb" event={"ID":"6c463008-ff10-4f2f-9d74-1b78a30c5720","Type":"ContainerDied","Data":"c71b7ff05d9c69561a8e0ab655858608c0044d0eb035a5845b17ac95079de6df"} Mar 07 07:20:05 crc kubenswrapper[4738]: I0307 07:20:05.248956 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c71b7ff05d9c69561a8e0ab655858608c0044d0eb035a5845b17ac95079de6df" Mar 07 07:20:05 crc kubenswrapper[4738]: I0307 07:20:05.248605 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547800-92slb" Mar 07 07:20:05 crc kubenswrapper[4738]: I0307 07:20:05.303477 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547794-gr7mp"] Mar 07 07:20:05 crc kubenswrapper[4738]: I0307 07:20:05.311857 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547794-gr7mp"] Mar 07 07:20:06 crc kubenswrapper[4738]: I0307 07:20:06.402450 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37661a52-49a3-4a00-abde-d7a211efde35" path="/var/lib/kubelet/pods/37661a52-49a3-4a00-abde-d7a211efde35/volumes" Mar 07 07:20:25 crc kubenswrapper[4738]: I0307 07:20:25.978924 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.072063 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.072813 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"0a930684-b4c9-4238-a12d-01befcc25561\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.072889 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0a930684-b4c9-4238-a12d-01befcc25561-etc-swift\") pod \"0a930684-b4c9-4238-a12d-01befcc25561\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.072923 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m69bs\" (UniqueName: \"kubernetes.io/projected/0a930684-b4c9-4238-a12d-01befcc25561-kube-api-access-m69bs\") pod 
\"0a930684-b4c9-4238-a12d-01befcc25561\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.073004 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0a930684-b4c9-4238-a12d-01befcc25561-lock\") pod \"0a930684-b4c9-4238-a12d-01befcc25561\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.073043 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0a930684-b4c9-4238-a12d-01befcc25561-cache\") pod \"0a930684-b4c9-4238-a12d-01befcc25561\" (UID: \"0a930684-b4c9-4238-a12d-01befcc25561\") " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.073677 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a930684-b4c9-4238-a12d-01befcc25561-cache" (OuterVolumeSpecName: "cache") pod "0a930684-b4c9-4238-a12d-01befcc25561" (UID: "0a930684-b4c9-4238-a12d-01befcc25561"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.074044 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0a930684-b4c9-4238-a12d-01befcc25561-lock" (OuterVolumeSpecName: "lock") pod "0a930684-b4c9-4238-a12d-01befcc25561" (UID: "0a930684-b4c9-4238-a12d-01befcc25561"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.075490 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.078521 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a930684-b4c9-4238-a12d-01befcc25561-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0a930684-b4c9-4238-a12d-01befcc25561" (UID: "0a930684-b4c9-4238-a12d-01befcc25561"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.078559 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0a930684-b4c9-4238-a12d-01befcc25561-kube-api-access-m69bs" (OuterVolumeSpecName: "kube-api-access-m69bs") pod "0a930684-b4c9-4238-a12d-01befcc25561" (UID: "0a930684-b4c9-4238-a12d-01befcc25561"). InnerVolumeSpecName "kube-api-access-m69bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.078883 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "swift") pod "0a930684-b4c9-4238-a12d-01befcc25561" (UID: "0a930684-b4c9-4238-a12d-01befcc25561"). InnerVolumeSpecName "local-storage09-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.174791 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q7zmv\" (UniqueName: \"kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-kube-api-access-q7zmv\") pod \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.174853 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-lock\") pod \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.174917 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkhjr\" (UniqueName: \"kubernetes.io/projected/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-kube-api-access-gkhjr\") pod \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.174974 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-lock\") pod \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.174992 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.175021 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-etc-swift\") pod \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.175067 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-cache\") pod \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.175111 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-cache\") pod \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.175134 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\" (UID: \"42cd11f3-6765-4609-9fc2-c1f0fb2981bf\") " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.175169 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift\") pod \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\" (UID: \"96ceb9b4-e0fc-488b-8ef4-e7092268ee23\") " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.175331 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-lock" (OuterVolumeSpecName: "lock") pod "42cd11f3-6765-4609-9fc2-c1f0fb2981bf" (UID: "42cd11f3-6765-4609-9fc2-c1f0fb2981bf"). InnerVolumeSpecName "lock". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.175705 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-lock" (OuterVolumeSpecName: "lock") pod "96ceb9b4-e0fc-488b-8ef4-e7092268ee23" (UID: "96ceb9b4-e0fc-488b-8ef4-e7092268ee23"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.175721 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-cache" (OuterVolumeSpecName: "cache") pod "42cd11f3-6765-4609-9fc2-c1f0fb2981bf" (UID: "42cd11f3-6765-4609-9fc2-c1f0fb2981bf"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.175953 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/0a930684-b4c9-4238-a12d-01befcc25561-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.175973 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m69bs\" (UniqueName: \"kubernetes.io/projected/0a930684-b4c9-4238-a12d-01befcc25561-kube-api-access-m69bs\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.175986 4738 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/0a930684-b4c9-4238-a12d-01befcc25561-lock\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.175994 4738 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-lock\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.176003 4738 
reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/0a930684-b4c9-4238-a12d-01befcc25561-cache\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.176011 4738 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-lock\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.176030 4738 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.176038 4738 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-cache\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.175954 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-cache" (OuterVolumeSpecName: "cache") pod "96ceb9b4-e0fc-488b-8ef4-e7092268ee23" (UID: "96ceb9b4-e0fc-488b-8ef4-e7092268ee23"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.178388 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-kube-api-access-gkhjr" (OuterVolumeSpecName: "kube-api-access-gkhjr") pod "42cd11f3-6765-4609-9fc2-c1f0fb2981bf" (UID: "42cd11f3-6765-4609-9fc2-c1f0fb2981bf"). InnerVolumeSpecName "kube-api-access-gkhjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.178429 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "96ceb9b4-e0fc-488b-8ef4-e7092268ee23" (UID: "96ceb9b4-e0fc-488b-8ef4-e7092268ee23"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.178825 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "96ceb9b4-e0fc-488b-8ef4-e7092268ee23" (UID: "96ceb9b4-e0fc-488b-8ef4-e7092268ee23"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.179289 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "swift") pod "42cd11f3-6765-4609-9fc2-c1f0fb2981bf" (UID: "42cd11f3-6765-4609-9fc2-c1f0fb2981bf"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.180352 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-kube-api-access-q7zmv" (OuterVolumeSpecName: "kube-api-access-q7zmv") pod "96ceb9b4-e0fc-488b-8ef4-e7092268ee23" (UID: "96ceb9b4-e0fc-488b-8ef4-e7092268ee23"). InnerVolumeSpecName "kube-api-access-q7zmv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.180461 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "42cd11f3-6765-4609-9fc2-c1f0fb2981bf" (UID: "42cd11f3-6765-4609-9fc2-c1f0fb2981bf"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.191938 4738 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.277966 4738 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-cache\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.278058 4738 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.278080 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.278099 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q7zmv\" (UniqueName: \"kubernetes.io/projected/96ceb9b4-e0fc-488b-8ef4-e7092268ee23-kube-api-access-q7zmv\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.278120 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gkhjr\" (UniqueName: 
\"kubernetes.io/projected/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-kube-api-access-gkhjr\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.278146 4738 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.278860 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/42cd11f3-6765-4609-9fc2-c1f0fb2981bf-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.278911 4738 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.303181 4738 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.308323 4738 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.381046 4738 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.381093 4738 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.481145 4738 generic.go:334] "Generic (PLEG): container finished" podID="0a930684-b4c9-4238-a12d-01befcc25561" 
containerID="e4c18fbcf45c419a845fca6021065e426ac363e04fbaedafa8d5996752493737" exitCode=137 Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.481286 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"e4c18fbcf45c419a845fca6021065e426ac363e04fbaedafa8d5996752493737"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.481387 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.481406 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"0a930684-b4c9-4238-a12d-01befcc25561","Type":"ContainerDied","Data":"0731eabda154de1456330e6822fe2448acf6b44750560d7f60f385949f040298"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.481447 4738 scope.go:117] "RemoveContainer" containerID="e4c18fbcf45c419a845fca6021065e426ac363e04fbaedafa8d5996752493737" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.512474 4738 generic.go:334] "Generic (PLEG): container finished" podID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerID="3c4cc8214b36420635ba039d3c00c6948d58d957980e0d1c9562383bfca77fc9" exitCode=137 Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.512673 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"3c4cc8214b36420635ba039d3c00c6948d58d957980e0d1c9562383bfca77fc9"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.512752 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"42cd11f3-6765-4609-9fc2-c1f0fb2981bf","Type":"ContainerDied","Data":"3414f1f3c96226ea3a3659df89d64de3ccb472634fc4c0164e01395a152d5c9c"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 
07:20:26.512777 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9accd7d56723a30b1e064c6db482dfe3c0c8c25a3362cb503fc9e1740a21d81"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.512795 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"cbcd5ef3a6beceade6a10aeef664573859ddea9850fed0f9280c7405361c726f"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.512842 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d87d9a39b92a07b8170662ba15ed360e84ba521d89ecbb449261b231eb5f9b80"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.512855 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0c1b0371d893f6e0390810a8aab913c1d9657f66534ae42885c70690589bf9e9"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.512866 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"53b976cf051f11cb8f660e774114d0ee41f535e9dedd4f043ca21c490af87734"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.512876 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c5c5b4afaf52f1ecfc98c8dcfbd532bf19abe26898fc6b73352a7c16747d6d59"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.512919 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0d5b3c74b209817952ccb739c381b7b9d12d42e85d7e4044ba3391791272d781"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.512932 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"be3d8c83518f0f3933501a1d36cf0fbd8bfd4dd31113abf7e182f8d39be340cb"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 
07:20:26.512943 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2fff460fa23e4d25e3da3e96d8425d2eb15036c2351751c1a3aebcd697cfb1d9"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.514900 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.517466 4738 scope.go:117] "RemoveContainer" containerID="6fc2d79e6ed76a2aa3902d1f04776c46bc6e6c351eb75738758e7d758faea391" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.520349 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.523869 4738 generic.go:334] "Generic (PLEG): container finished" podID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerID="3da28d20ae0901df855c4510805efbf79e985addd356e9506c54db5f6d027287" exitCode=137 Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.523919 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"3da28d20ae0901df855c4510805efbf79e985addd356e9506c54db5f6d027287"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.523947 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ac2a985e3cb60a09d256b7974029df113f3080d4297a16f859b41d021df8333"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.523962 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90084ef90d77e89230917bd780cb78df4fafd97840fe50ada61c1c20a43914f9"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.523971 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"b4724281a96bff261edcd51fb75fe7f352935e15dfd29e22addc75c7164d87df"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.523978 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0bd0f18099d0d2501aaf5782a927bc5913e1c189ce48a68e47f375657b6799b0"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.523985 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"137defbc952bf81c1f496e28b2c150ef0bf8561eb3634039ffac430db7e04802"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.523992 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5a05ae7d39e3985637e6577343ab598669a43dc85bcd5dd09ddc9b373c1cc837"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.523999 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"023ab348f9630e3188399acebb441225f16fe8a3ea0a45562a1416ce00ed2a1d"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524006 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"268471195f290d34a1640c6c40556c0cc0df5c38ba265a2054454572ddb1a8aa"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524013 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fea746265db6cca797f4aab10f0142d479f000ff811043ba2d786b1386e70f97"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524020 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"634c98d8bd723fe8365bb0a579d302058fe17e15dc44a8c89fb077a91f9db720"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524026 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"5c8a691cfe1a8ab8c30a26b23434cdb8ee4f0a45358a201eb4ecef3b2797785a"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524034 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2ffa63c4afb5dd07e9a57782d9c93b48afd95857655b0d188d0b3255e43c57d"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524041 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7e3233cc25e094b9420df4a7ee6bad5a6ddbd092e5b08e43d31d29b920a2d134"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524050 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9a84b5ce18a92c7cd8e2aa843f5b3cacd9d67fea7fb130fc739af256a55b82c"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524074 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"96ceb9b4-e0fc-488b-8ef4-e7092268ee23","Type":"ContainerDied","Data":"e4672af42ab4a31dc2864a5375b24527b0d4847a8526040e4c1e994f0b584462"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524086 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"3da28d20ae0901df855c4510805efbf79e985addd356e9506c54db5f6d027287"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524094 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4ac2a985e3cb60a09d256b7974029df113f3080d4297a16f859b41d021df8333"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524101 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"90084ef90d77e89230917bd780cb78df4fafd97840fe50ada61c1c20a43914f9"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524107 4738 pod_container_deletor.go:114] 
"Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b4724281a96bff261edcd51fb75fe7f352935e15dfd29e22addc75c7164d87df"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524114 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0bd0f18099d0d2501aaf5782a927bc5913e1c189ce48a68e47f375657b6799b0"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524120 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"137defbc952bf81c1f496e28b2c150ef0bf8561eb3634039ffac430db7e04802"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524127 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5a05ae7d39e3985637e6577343ab598669a43dc85bcd5dd09ddc9b373c1cc837"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524134 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"023ab348f9630e3188399acebb441225f16fe8a3ea0a45562a1416ce00ed2a1d"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524140 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"268471195f290d34a1640c6c40556c0cc0df5c38ba265a2054454572ddb1a8aa"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524147 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"fea746265db6cca797f4aab10f0142d479f000ff811043ba2d786b1386e70f97"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524153 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"634c98d8bd723fe8365bb0a579d302058fe17e15dc44a8c89fb077a91f9db720"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524180 4738 pod_container_deletor.go:114] "Failed 
to issue the request to remove container" containerID={"Type":"cri-o","ID":"5c8a691cfe1a8ab8c30a26b23434cdb8ee4f0a45358a201eb4ecef3b2797785a"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524187 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a2ffa63c4afb5dd07e9a57782d9c93b48afd95857655b0d188d0b3255e43c57d"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524193 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7e3233cc25e094b9420df4a7ee6bad5a6ddbd092e5b08e43d31d29b920a2d134"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524200 4738 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9a84b5ce18a92c7cd8e2aa843f5b3cacd9d67fea7fb130fc739af256a55b82c"} Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.524047 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.539138 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.544844 4738 scope.go:117] "RemoveContainer" containerID="447e67220d4537b714a6ce7a1b0dc1da7d80f2bb3a44e88e3217ceea7c2cc9fb" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.566469 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.575904 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.582819 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.590928 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.593811 4738 scope.go:117] "RemoveContainer" containerID="52a417a8eb045429236afe6ab639ca8c67812669b31cd87f19e7fedeb5dbbb8f" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.611123 4738 scope.go:117] "RemoveContainer" containerID="ff3f41c50d65750c83aa588726223a30040cd27c4fe4b1cd143575c10ea82fbb" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.635857 4738 scope.go:117] "RemoveContainer" containerID="3d2c3ed7df67a6ff4073f7d5d24e40afac401c61651cdd6cc0c5759241ca9803" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.664362 4738 scope.go:117] "RemoveContainer" containerID="56815f9730bc75bdcd8f18fc8d3075fc23567b21f96c07f6bab481f604850c00" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.714390 4738 scope.go:117] "RemoveContainer" containerID="bf95213389fa376cb539bf6d8ad3ac8578576ffa41752dec1d469f2c9f7d6709" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.733343 
4738 scope.go:117] "RemoveContainer" containerID="526da633fdfa13ab59af0be631f0a57382d2c13361cb1aa538f116b31206c3a6" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.750500 4738 scope.go:117] "RemoveContainer" containerID="50d54dad2c11e1433417c1d5eacc2525ef13ccf32383fac908e7da619461d287" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.770460 4738 scope.go:117] "RemoveContainer" containerID="e8f567722e26c5855f8a335226bfbd216ea673204774b39a300d238c1101b3de" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.791909 4738 scope.go:117] "RemoveContainer" containerID="74c28c6df141378f88f7488ff6bfdacddb398f412f23b920bdc8e2344aa82df6" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.810782 4738 scope.go:117] "RemoveContainer" containerID="32adec2ee3d7eb5e9f9c524bd23ecaedb1d049e703cc9542af2e891a02130be4" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.834724 4738 scope.go:117] "RemoveContainer" containerID="bf7ea1f65a1c10b00d492aa501355337b6dba851f133cc9e81fc1dbcb1865718" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.852880 4738 scope.go:117] "RemoveContainer" containerID="b81632c1938ae5c7f37a56623bb7e08e9c9b83799f684481b5b287988a038ff8" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.873014 4738 scope.go:117] "RemoveContainer" containerID="e4c18fbcf45c419a845fca6021065e426ac363e04fbaedafa8d5996752493737" Mar 07 07:20:26 crc kubenswrapper[4738]: E0307 07:20:26.873679 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4c18fbcf45c419a845fca6021065e426ac363e04fbaedafa8d5996752493737\": container with ID starting with e4c18fbcf45c419a845fca6021065e426ac363e04fbaedafa8d5996752493737 not found: ID does not exist" containerID="e4c18fbcf45c419a845fca6021065e426ac363e04fbaedafa8d5996752493737" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.873728 4738 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e4c18fbcf45c419a845fca6021065e426ac363e04fbaedafa8d5996752493737"} err="failed to get container status \"e4c18fbcf45c419a845fca6021065e426ac363e04fbaedafa8d5996752493737\": rpc error: code = NotFound desc = could not find container \"e4c18fbcf45c419a845fca6021065e426ac363e04fbaedafa8d5996752493737\": container with ID starting with e4c18fbcf45c419a845fca6021065e426ac363e04fbaedafa8d5996752493737 not found: ID does not exist" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.873761 4738 scope.go:117] "RemoveContainer" containerID="6fc2d79e6ed76a2aa3902d1f04776c46bc6e6c351eb75738758e7d758faea391" Mar 07 07:20:26 crc kubenswrapper[4738]: E0307 07:20:26.874046 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6fc2d79e6ed76a2aa3902d1f04776c46bc6e6c351eb75738758e7d758faea391\": container with ID starting with 6fc2d79e6ed76a2aa3902d1f04776c46bc6e6c351eb75738758e7d758faea391 not found: ID does not exist" containerID="6fc2d79e6ed76a2aa3902d1f04776c46bc6e6c351eb75738758e7d758faea391" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.874090 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6fc2d79e6ed76a2aa3902d1f04776c46bc6e6c351eb75738758e7d758faea391"} err="failed to get container status \"6fc2d79e6ed76a2aa3902d1f04776c46bc6e6c351eb75738758e7d758faea391\": rpc error: code = NotFound desc = could not find container \"6fc2d79e6ed76a2aa3902d1f04776c46bc6e6c351eb75738758e7d758faea391\": container with ID starting with 6fc2d79e6ed76a2aa3902d1f04776c46bc6e6c351eb75738758e7d758faea391 not found: ID does not exist" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.874118 4738 scope.go:117] "RemoveContainer" containerID="447e67220d4537b714a6ce7a1b0dc1da7d80f2bb3a44e88e3217ceea7c2cc9fb" Mar 07 07:20:26 crc kubenswrapper[4738]: E0307 07:20:26.874395 4738 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"447e67220d4537b714a6ce7a1b0dc1da7d80f2bb3a44e88e3217ceea7c2cc9fb\": container with ID starting with 447e67220d4537b714a6ce7a1b0dc1da7d80f2bb3a44e88e3217ceea7c2cc9fb not found: ID does not exist" containerID="447e67220d4537b714a6ce7a1b0dc1da7d80f2bb3a44e88e3217ceea7c2cc9fb" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.874431 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"447e67220d4537b714a6ce7a1b0dc1da7d80f2bb3a44e88e3217ceea7c2cc9fb"} err="failed to get container status \"447e67220d4537b714a6ce7a1b0dc1da7d80f2bb3a44e88e3217ceea7c2cc9fb\": rpc error: code = NotFound desc = could not find container \"447e67220d4537b714a6ce7a1b0dc1da7d80f2bb3a44e88e3217ceea7c2cc9fb\": container with ID starting with 447e67220d4537b714a6ce7a1b0dc1da7d80f2bb3a44e88e3217ceea7c2cc9fb not found: ID does not exist" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.874452 4738 scope.go:117] "RemoveContainer" containerID="52a417a8eb045429236afe6ab639ca8c67812669b31cd87f19e7fedeb5dbbb8f" Mar 07 07:20:26 crc kubenswrapper[4738]: E0307 07:20:26.874708 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52a417a8eb045429236afe6ab639ca8c67812669b31cd87f19e7fedeb5dbbb8f\": container with ID starting with 52a417a8eb045429236afe6ab639ca8c67812669b31cd87f19e7fedeb5dbbb8f not found: ID does not exist" containerID="52a417a8eb045429236afe6ab639ca8c67812669b31cd87f19e7fedeb5dbbb8f" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.874739 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52a417a8eb045429236afe6ab639ca8c67812669b31cd87f19e7fedeb5dbbb8f"} err="failed to get container status \"52a417a8eb045429236afe6ab639ca8c67812669b31cd87f19e7fedeb5dbbb8f\": rpc error: code = NotFound desc = could not find container 
\"52a417a8eb045429236afe6ab639ca8c67812669b31cd87f19e7fedeb5dbbb8f\": container with ID starting with 52a417a8eb045429236afe6ab639ca8c67812669b31cd87f19e7fedeb5dbbb8f not found: ID does not exist" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.874762 4738 scope.go:117] "RemoveContainer" containerID="ff3f41c50d65750c83aa588726223a30040cd27c4fe4b1cd143575c10ea82fbb" Mar 07 07:20:26 crc kubenswrapper[4738]: E0307 07:20:26.874980 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff3f41c50d65750c83aa588726223a30040cd27c4fe4b1cd143575c10ea82fbb\": container with ID starting with ff3f41c50d65750c83aa588726223a30040cd27c4fe4b1cd143575c10ea82fbb not found: ID does not exist" containerID="ff3f41c50d65750c83aa588726223a30040cd27c4fe4b1cd143575c10ea82fbb" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.875012 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff3f41c50d65750c83aa588726223a30040cd27c4fe4b1cd143575c10ea82fbb"} err="failed to get container status \"ff3f41c50d65750c83aa588726223a30040cd27c4fe4b1cd143575c10ea82fbb\": rpc error: code = NotFound desc = could not find container \"ff3f41c50d65750c83aa588726223a30040cd27c4fe4b1cd143575c10ea82fbb\": container with ID starting with ff3f41c50d65750c83aa588726223a30040cd27c4fe4b1cd143575c10ea82fbb not found: ID does not exist" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.875030 4738 scope.go:117] "RemoveContainer" containerID="3d2c3ed7df67a6ff4073f7d5d24e40afac401c61651cdd6cc0c5759241ca9803" Mar 07 07:20:26 crc kubenswrapper[4738]: E0307 07:20:26.875254 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d2c3ed7df67a6ff4073f7d5d24e40afac401c61651cdd6cc0c5759241ca9803\": container with ID starting with 3d2c3ed7df67a6ff4073f7d5d24e40afac401c61651cdd6cc0c5759241ca9803 not found: ID does not exist" 
containerID="3d2c3ed7df67a6ff4073f7d5d24e40afac401c61651cdd6cc0c5759241ca9803" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.875290 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d2c3ed7df67a6ff4073f7d5d24e40afac401c61651cdd6cc0c5759241ca9803"} err="failed to get container status \"3d2c3ed7df67a6ff4073f7d5d24e40afac401c61651cdd6cc0c5759241ca9803\": rpc error: code = NotFound desc = could not find container \"3d2c3ed7df67a6ff4073f7d5d24e40afac401c61651cdd6cc0c5759241ca9803\": container with ID starting with 3d2c3ed7df67a6ff4073f7d5d24e40afac401c61651cdd6cc0c5759241ca9803 not found: ID does not exist" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.875313 4738 scope.go:117] "RemoveContainer" containerID="56815f9730bc75bdcd8f18fc8d3075fc23567b21f96c07f6bab481f604850c00" Mar 07 07:20:26 crc kubenswrapper[4738]: E0307 07:20:26.875518 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56815f9730bc75bdcd8f18fc8d3075fc23567b21f96c07f6bab481f604850c00\": container with ID starting with 56815f9730bc75bdcd8f18fc8d3075fc23567b21f96c07f6bab481f604850c00 not found: ID does not exist" containerID="56815f9730bc75bdcd8f18fc8d3075fc23567b21f96c07f6bab481f604850c00" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.875546 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56815f9730bc75bdcd8f18fc8d3075fc23567b21f96c07f6bab481f604850c00"} err="failed to get container status \"56815f9730bc75bdcd8f18fc8d3075fc23567b21f96c07f6bab481f604850c00\": rpc error: code = NotFound desc = could not find container \"56815f9730bc75bdcd8f18fc8d3075fc23567b21f96c07f6bab481f604850c00\": container with ID starting with 56815f9730bc75bdcd8f18fc8d3075fc23567b21f96c07f6bab481f604850c00 not found: ID does not exist" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.875566 4738 scope.go:117] 
"RemoveContainer" containerID="bf95213389fa376cb539bf6d8ad3ac8578576ffa41752dec1d469f2c9f7d6709" Mar 07 07:20:26 crc kubenswrapper[4738]: E0307 07:20:26.875759 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf95213389fa376cb539bf6d8ad3ac8578576ffa41752dec1d469f2c9f7d6709\": container with ID starting with bf95213389fa376cb539bf6d8ad3ac8578576ffa41752dec1d469f2c9f7d6709 not found: ID does not exist" containerID="bf95213389fa376cb539bf6d8ad3ac8578576ffa41752dec1d469f2c9f7d6709" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.875786 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf95213389fa376cb539bf6d8ad3ac8578576ffa41752dec1d469f2c9f7d6709"} err="failed to get container status \"bf95213389fa376cb539bf6d8ad3ac8578576ffa41752dec1d469f2c9f7d6709\": rpc error: code = NotFound desc = could not find container \"bf95213389fa376cb539bf6d8ad3ac8578576ffa41752dec1d469f2c9f7d6709\": container with ID starting with bf95213389fa376cb539bf6d8ad3ac8578576ffa41752dec1d469f2c9f7d6709 not found: ID does not exist" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.875802 4738 scope.go:117] "RemoveContainer" containerID="526da633fdfa13ab59af0be631f0a57382d2c13361cb1aa538f116b31206c3a6" Mar 07 07:20:26 crc kubenswrapper[4738]: E0307 07:20:26.876029 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"526da633fdfa13ab59af0be631f0a57382d2c13361cb1aa538f116b31206c3a6\": container with ID starting with 526da633fdfa13ab59af0be631f0a57382d2c13361cb1aa538f116b31206c3a6 not found: ID does not exist" containerID="526da633fdfa13ab59af0be631f0a57382d2c13361cb1aa538f116b31206c3a6" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.876055 4738 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"526da633fdfa13ab59af0be631f0a57382d2c13361cb1aa538f116b31206c3a6"} err="failed to get container status \"526da633fdfa13ab59af0be631f0a57382d2c13361cb1aa538f116b31206c3a6\": rpc error: code = NotFound desc = could not find container \"526da633fdfa13ab59af0be631f0a57382d2c13361cb1aa538f116b31206c3a6\": container with ID starting with 526da633fdfa13ab59af0be631f0a57382d2c13361cb1aa538f116b31206c3a6 not found: ID does not exist" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.876073 4738 scope.go:117] "RemoveContainer" containerID="50d54dad2c11e1433417c1d5eacc2525ef13ccf32383fac908e7da619461d287" Mar 07 07:20:26 crc kubenswrapper[4738]: E0307 07:20:26.876311 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50d54dad2c11e1433417c1d5eacc2525ef13ccf32383fac908e7da619461d287\": container with ID starting with 50d54dad2c11e1433417c1d5eacc2525ef13ccf32383fac908e7da619461d287 not found: ID does not exist" containerID="50d54dad2c11e1433417c1d5eacc2525ef13ccf32383fac908e7da619461d287" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.876334 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50d54dad2c11e1433417c1d5eacc2525ef13ccf32383fac908e7da619461d287"} err="failed to get container status \"50d54dad2c11e1433417c1d5eacc2525ef13ccf32383fac908e7da619461d287\": rpc error: code = NotFound desc = could not find container \"50d54dad2c11e1433417c1d5eacc2525ef13ccf32383fac908e7da619461d287\": container with ID starting with 50d54dad2c11e1433417c1d5eacc2525ef13ccf32383fac908e7da619461d287 not found: ID does not exist" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.876352 4738 scope.go:117] "RemoveContainer" containerID="e8f567722e26c5855f8a335226bfbd216ea673204774b39a300d238c1101b3de" Mar 07 07:20:26 crc kubenswrapper[4738]: E0307 07:20:26.876547 4738 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e8f567722e26c5855f8a335226bfbd216ea673204774b39a300d238c1101b3de\": container with ID starting with e8f567722e26c5855f8a335226bfbd216ea673204774b39a300d238c1101b3de not found: ID does not exist" containerID="e8f567722e26c5855f8a335226bfbd216ea673204774b39a300d238c1101b3de" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.876571 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8f567722e26c5855f8a335226bfbd216ea673204774b39a300d238c1101b3de"} err="failed to get container status \"e8f567722e26c5855f8a335226bfbd216ea673204774b39a300d238c1101b3de\": rpc error: code = NotFound desc = could not find container \"e8f567722e26c5855f8a335226bfbd216ea673204774b39a300d238c1101b3de\": container with ID starting with e8f567722e26c5855f8a335226bfbd216ea673204774b39a300d238c1101b3de not found: ID does not exist" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.876587 4738 scope.go:117] "RemoveContainer" containerID="74c28c6df141378f88f7488ff6bfdacddb398f412f23b920bdc8e2344aa82df6" Mar 07 07:20:26 crc kubenswrapper[4738]: E0307 07:20:26.876783 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74c28c6df141378f88f7488ff6bfdacddb398f412f23b920bdc8e2344aa82df6\": container with ID starting with 74c28c6df141378f88f7488ff6bfdacddb398f412f23b920bdc8e2344aa82df6 not found: ID does not exist" containerID="74c28c6df141378f88f7488ff6bfdacddb398f412f23b920bdc8e2344aa82df6" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.876809 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74c28c6df141378f88f7488ff6bfdacddb398f412f23b920bdc8e2344aa82df6"} err="failed to get container status \"74c28c6df141378f88f7488ff6bfdacddb398f412f23b920bdc8e2344aa82df6\": rpc error: code = NotFound desc = could not find container 
\"74c28c6df141378f88f7488ff6bfdacddb398f412f23b920bdc8e2344aa82df6\": container with ID starting with 74c28c6df141378f88f7488ff6bfdacddb398f412f23b920bdc8e2344aa82df6 not found: ID does not exist" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.876826 4738 scope.go:117] "RemoveContainer" containerID="32adec2ee3d7eb5e9f9c524bd23ecaedb1d049e703cc9542af2e891a02130be4" Mar 07 07:20:26 crc kubenswrapper[4738]: E0307 07:20:26.877043 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32adec2ee3d7eb5e9f9c524bd23ecaedb1d049e703cc9542af2e891a02130be4\": container with ID starting with 32adec2ee3d7eb5e9f9c524bd23ecaedb1d049e703cc9542af2e891a02130be4 not found: ID does not exist" containerID="32adec2ee3d7eb5e9f9c524bd23ecaedb1d049e703cc9542af2e891a02130be4" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.877071 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32adec2ee3d7eb5e9f9c524bd23ecaedb1d049e703cc9542af2e891a02130be4"} err="failed to get container status \"32adec2ee3d7eb5e9f9c524bd23ecaedb1d049e703cc9542af2e891a02130be4\": rpc error: code = NotFound desc = could not find container \"32adec2ee3d7eb5e9f9c524bd23ecaedb1d049e703cc9542af2e891a02130be4\": container with ID starting with 32adec2ee3d7eb5e9f9c524bd23ecaedb1d049e703cc9542af2e891a02130be4 not found: ID does not exist" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.877088 4738 scope.go:117] "RemoveContainer" containerID="bf7ea1f65a1c10b00d492aa501355337b6dba851f133cc9e81fc1dbcb1865718" Mar 07 07:20:26 crc kubenswrapper[4738]: E0307 07:20:26.877338 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf7ea1f65a1c10b00d492aa501355337b6dba851f133cc9e81fc1dbcb1865718\": container with ID starting with bf7ea1f65a1c10b00d492aa501355337b6dba851f133cc9e81fc1dbcb1865718 not found: ID does not exist" 
containerID="bf7ea1f65a1c10b00d492aa501355337b6dba851f133cc9e81fc1dbcb1865718" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.877364 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf7ea1f65a1c10b00d492aa501355337b6dba851f133cc9e81fc1dbcb1865718"} err="failed to get container status \"bf7ea1f65a1c10b00d492aa501355337b6dba851f133cc9e81fc1dbcb1865718\": rpc error: code = NotFound desc = could not find container \"bf7ea1f65a1c10b00d492aa501355337b6dba851f133cc9e81fc1dbcb1865718\": container with ID starting with bf7ea1f65a1c10b00d492aa501355337b6dba851f133cc9e81fc1dbcb1865718 not found: ID does not exist" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.877380 4738 scope.go:117] "RemoveContainer" containerID="b81632c1938ae5c7f37a56623bb7e08e9c9b83799f684481b5b287988a038ff8" Mar 07 07:20:26 crc kubenswrapper[4738]: E0307 07:20:26.877627 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b81632c1938ae5c7f37a56623bb7e08e9c9b83799f684481b5b287988a038ff8\": container with ID starting with b81632c1938ae5c7f37a56623bb7e08e9c9b83799f684481b5b287988a038ff8 not found: ID does not exist" containerID="b81632c1938ae5c7f37a56623bb7e08e9c9b83799f684481b5b287988a038ff8" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.877652 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b81632c1938ae5c7f37a56623bb7e08e9c9b83799f684481b5b287988a038ff8"} err="failed to get container status \"b81632c1938ae5c7f37a56623bb7e08e9c9b83799f684481b5b287988a038ff8\": rpc error: code = NotFound desc = could not find container \"b81632c1938ae5c7f37a56623bb7e08e9c9b83799f684481b5b287988a038ff8\": container with ID starting with b81632c1938ae5c7f37a56623bb7e08e9c9b83799f684481b5b287988a038ff8 not found: ID does not exist" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.877667 4738 scope.go:117] 
"RemoveContainer" containerID="3c4cc8214b36420635ba039d3c00c6948d58d957980e0d1c9562383bfca77fc9" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.897358 4738 scope.go:117] "RemoveContainer" containerID="8c2fe218c6e02f331d33d7063e8482d45cd9f32b3789ca9c49d926c67499fc14" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.914193 4738 scope.go:117] "RemoveContainer" containerID="39b4e0f1e6d7033b1684d6624ba3e2c94b2b55568b02cdcebd3bae46f528dfc1" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.930798 4738 scope.go:117] "RemoveContainer" containerID="01acc461f0e0e846b5a4e7ada9595d0b8c4c9683efed29b5353bf8ef995cf52a" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.948561 4738 scope.go:117] "RemoveContainer" containerID="40f811bec3d868d647e181fbd2ff35f48f2b70d83c3724ab4152f5aaefd0edab" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.963445 4738 scope.go:117] "RemoveContainer" containerID="f6afa155d642e49003483537eaf5f80def6b71bbd9ccee398e1d91a13ac268a8" Mar 07 07:20:26 crc kubenswrapper[4738]: I0307 07:20:26.979593 4738 scope.go:117] "RemoveContainer" containerID="c9accd7d56723a30b1e064c6db482dfe3c0c8c25a3362cb503fc9e1740a21d81" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.002682 4738 scope.go:117] "RemoveContainer" containerID="cbcd5ef3a6beceade6a10aeef664573859ddea9850fed0f9280c7405361c726f" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.018074 4738 scope.go:117] "RemoveContainer" containerID="d87d9a39b92a07b8170662ba15ed360e84ba521d89ecbb449261b231eb5f9b80" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.039031 4738 scope.go:117] "RemoveContainer" containerID="0c1b0371d893f6e0390810a8aab913c1d9657f66534ae42885c70690589bf9e9" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.063649 4738 scope.go:117] "RemoveContainer" containerID="53b976cf051f11cb8f660e774114d0ee41f535e9dedd4f043ca21c490af87734" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.081358 4738 scope.go:117] "RemoveContainer" 
containerID="c5c5b4afaf52f1ecfc98c8dcfbd532bf19abe26898fc6b73352a7c16747d6d59" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.095482 4738 scope.go:117] "RemoveContainer" containerID="0d5b3c74b209817952ccb739c381b7b9d12d42e85d7e4044ba3391791272d781" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.112255 4738 scope.go:117] "RemoveContainer" containerID="be3d8c83518f0f3933501a1d36cf0fbd8bfd4dd31113abf7e182f8d39be340cb" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.129192 4738 scope.go:117] "RemoveContainer" containerID="2fff460fa23e4d25e3da3e96d8425d2eb15036c2351751c1a3aebcd697cfb1d9" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.143743 4738 scope.go:117] "RemoveContainer" containerID="3c4cc8214b36420635ba039d3c00c6948d58d957980e0d1c9562383bfca77fc9" Mar 07 07:20:27 crc kubenswrapper[4738]: E0307 07:20:27.144345 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c4cc8214b36420635ba039d3c00c6948d58d957980e0d1c9562383bfca77fc9\": container with ID starting with 3c4cc8214b36420635ba039d3c00c6948d58d957980e0d1c9562383bfca77fc9 not found: ID does not exist" containerID="3c4cc8214b36420635ba039d3c00c6948d58d957980e0d1c9562383bfca77fc9" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.144392 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c4cc8214b36420635ba039d3c00c6948d58d957980e0d1c9562383bfca77fc9"} err="failed to get container status \"3c4cc8214b36420635ba039d3c00c6948d58d957980e0d1c9562383bfca77fc9\": rpc error: code = NotFound desc = could not find container \"3c4cc8214b36420635ba039d3c00c6948d58d957980e0d1c9562383bfca77fc9\": container with ID starting with 3c4cc8214b36420635ba039d3c00c6948d58d957980e0d1c9562383bfca77fc9 not found: ID does not exist" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.144422 4738 scope.go:117] "RemoveContainer" 
containerID="8c2fe218c6e02f331d33d7063e8482d45cd9f32b3789ca9c49d926c67499fc14" Mar 07 07:20:27 crc kubenswrapper[4738]: E0307 07:20:27.144759 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c2fe218c6e02f331d33d7063e8482d45cd9f32b3789ca9c49d926c67499fc14\": container with ID starting with 8c2fe218c6e02f331d33d7063e8482d45cd9f32b3789ca9c49d926c67499fc14 not found: ID does not exist" containerID="8c2fe218c6e02f331d33d7063e8482d45cd9f32b3789ca9c49d926c67499fc14" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.144807 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c2fe218c6e02f331d33d7063e8482d45cd9f32b3789ca9c49d926c67499fc14"} err="failed to get container status \"8c2fe218c6e02f331d33d7063e8482d45cd9f32b3789ca9c49d926c67499fc14\": rpc error: code = NotFound desc = could not find container \"8c2fe218c6e02f331d33d7063e8482d45cd9f32b3789ca9c49d926c67499fc14\": container with ID starting with 8c2fe218c6e02f331d33d7063e8482d45cd9f32b3789ca9c49d926c67499fc14 not found: ID does not exist" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.144839 4738 scope.go:117] "RemoveContainer" containerID="39b4e0f1e6d7033b1684d6624ba3e2c94b2b55568b02cdcebd3bae46f528dfc1" Mar 07 07:20:27 crc kubenswrapper[4738]: E0307 07:20:27.145201 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39b4e0f1e6d7033b1684d6624ba3e2c94b2b55568b02cdcebd3bae46f528dfc1\": container with ID starting with 39b4e0f1e6d7033b1684d6624ba3e2c94b2b55568b02cdcebd3bae46f528dfc1 not found: ID does not exist" containerID="39b4e0f1e6d7033b1684d6624ba3e2c94b2b55568b02cdcebd3bae46f528dfc1" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.145254 4738 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"39b4e0f1e6d7033b1684d6624ba3e2c94b2b55568b02cdcebd3bae46f528dfc1"} err="failed to get container status \"39b4e0f1e6d7033b1684d6624ba3e2c94b2b55568b02cdcebd3bae46f528dfc1\": rpc error: code = NotFound desc = could not find container \"39b4e0f1e6d7033b1684d6624ba3e2c94b2b55568b02cdcebd3bae46f528dfc1\": container with ID starting with 39b4e0f1e6d7033b1684d6624ba3e2c94b2b55568b02cdcebd3bae46f528dfc1 not found: ID does not exist" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.145285 4738 scope.go:117] "RemoveContainer" containerID="01acc461f0e0e846b5a4e7ada9595d0b8c4c9683efed29b5353bf8ef995cf52a" Mar 07 07:20:27 crc kubenswrapper[4738]: E0307 07:20:27.145572 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01acc461f0e0e846b5a4e7ada9595d0b8c4c9683efed29b5353bf8ef995cf52a\": container with ID starting with 01acc461f0e0e846b5a4e7ada9595d0b8c4c9683efed29b5353bf8ef995cf52a not found: ID does not exist" containerID="01acc461f0e0e846b5a4e7ada9595d0b8c4c9683efed29b5353bf8ef995cf52a" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.145598 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01acc461f0e0e846b5a4e7ada9595d0b8c4c9683efed29b5353bf8ef995cf52a"} err="failed to get container status \"01acc461f0e0e846b5a4e7ada9595d0b8c4c9683efed29b5353bf8ef995cf52a\": rpc error: code = NotFound desc = could not find container \"01acc461f0e0e846b5a4e7ada9595d0b8c4c9683efed29b5353bf8ef995cf52a\": container with ID starting with 01acc461f0e0e846b5a4e7ada9595d0b8c4c9683efed29b5353bf8ef995cf52a not found: ID does not exist" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.145613 4738 scope.go:117] "RemoveContainer" containerID="40f811bec3d868d647e181fbd2ff35f48f2b70d83c3724ab4152f5aaefd0edab" Mar 07 07:20:27 crc kubenswrapper[4738]: E0307 07:20:27.146106 4738 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"40f811bec3d868d647e181fbd2ff35f48f2b70d83c3724ab4152f5aaefd0edab\": container with ID starting with 40f811bec3d868d647e181fbd2ff35f48f2b70d83c3724ab4152f5aaefd0edab not found: ID does not exist" containerID="40f811bec3d868d647e181fbd2ff35f48f2b70d83c3724ab4152f5aaefd0edab" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.146170 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40f811bec3d868d647e181fbd2ff35f48f2b70d83c3724ab4152f5aaefd0edab"} err="failed to get container status \"40f811bec3d868d647e181fbd2ff35f48f2b70d83c3724ab4152f5aaefd0edab\": rpc error: code = NotFound desc = could not find container \"40f811bec3d868d647e181fbd2ff35f48f2b70d83c3724ab4152f5aaefd0edab\": container with ID starting with 40f811bec3d868d647e181fbd2ff35f48f2b70d83c3724ab4152f5aaefd0edab not found: ID does not exist" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.146196 4738 scope.go:117] "RemoveContainer" containerID="f6afa155d642e49003483537eaf5f80def6b71bbd9ccee398e1d91a13ac268a8" Mar 07 07:20:27 crc kubenswrapper[4738]: E0307 07:20:27.146509 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6afa155d642e49003483537eaf5f80def6b71bbd9ccee398e1d91a13ac268a8\": container with ID starting with f6afa155d642e49003483537eaf5f80def6b71bbd9ccee398e1d91a13ac268a8 not found: ID does not exist" containerID="f6afa155d642e49003483537eaf5f80def6b71bbd9ccee398e1d91a13ac268a8" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.146550 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6afa155d642e49003483537eaf5f80def6b71bbd9ccee398e1d91a13ac268a8"} err="failed to get container status \"f6afa155d642e49003483537eaf5f80def6b71bbd9ccee398e1d91a13ac268a8\": rpc error: code = NotFound desc = could not find container 
\"f6afa155d642e49003483537eaf5f80def6b71bbd9ccee398e1d91a13ac268a8\": container with ID starting with f6afa155d642e49003483537eaf5f80def6b71bbd9ccee398e1d91a13ac268a8 not found: ID does not exist" Mar 07 07:20:27 crc kubenswrapper[4738]: I0307 07:20:27.146574 4738 scope.go:117] "RemoveContainer" containerID="3da28d20ae0901df855c4510805efbf79e985addd356e9506c54db5f6d027287" Mar 07 07:20:28 crc kubenswrapper[4738]: I0307 07:20:28.402860 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0a930684-b4c9-4238-a12d-01befcc25561" path="/var/lib/kubelet/pods/0a930684-b4c9-4238-a12d-01befcc25561/volumes" Mar 07 07:20:28 crc kubenswrapper[4738]: I0307 07:20:28.406968 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" path="/var/lib/kubelet/pods/42cd11f3-6765-4609-9fc2-c1f0fb2981bf/volumes" Mar 07 07:20:28 crc kubenswrapper[4738]: I0307 07:20:28.410965 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" path="/var/lib/kubelet/pods/96ceb9b4-e0fc-488b-8ef4-e7092268ee23/volumes" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.662302 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.662859 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-expirer" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.662873 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-expirer" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.662887 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.662895 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.662906 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.662914 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.662924 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.662934 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.662947 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-expirer" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.662957 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-expirer" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.662995 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663004 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-server" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663015 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-reaper" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663022 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-reaper" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663036 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-reaper" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663044 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-reaper" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663053 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663061 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663072 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663080 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-server" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663090 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663098 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663111 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663118 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-server" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663130 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663137 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663150 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663183 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663192 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="rsync" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663199 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="rsync" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663212 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663220 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663232 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663239 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663249 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663257 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663266 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="swift-recon-cron" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663274 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="swift-recon-cron" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663284 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663291 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663300 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663308 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663321 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663329 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-server" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663338 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663346 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663356 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663364 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663374 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663382 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-server" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663389 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663397 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663410 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="rsync" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663417 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="rsync" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663426 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663433 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663441 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c463008-ff10-4f2f-9d74-1b78a30c5720" containerName="oc" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663449 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c463008-ff10-4f2f-9d74-1b78a30c5720" containerName="oc" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663460 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="swift-recon-cron" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663467 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="swift-recon-cron" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663480 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663488 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663496 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663503 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663516 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663523 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-server" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663538 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663546 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663558 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663566 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-server" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663574 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-reaper" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663581 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-reaper" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663591 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663599 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-server" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663609 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663616 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663629 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663636 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-server" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663646 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663653 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663665 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-expirer" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663672 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-expirer" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663682 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="rsync" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663690 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="rsync" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663703 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663711 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663723 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663731 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663744 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="swift-recon-cron" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663751 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="swift-recon-cron" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.663761 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663768 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663908 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663921 4738 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-reaper" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663931 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663941 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663954 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c463008-ff10-4f2f-9d74-1b78a30c5720" containerName="oc" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663963 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663975 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="swift-recon-cron" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663984 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="rsync" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.663997 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664007 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="rsync" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664015 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664024 4738 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664036 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-expirer" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664047 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-reaper" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664060 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664068 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-reaper" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664078 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664088 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664096 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664104 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-expirer" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664114 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664124 4738 
memory_manager.go:354] "RemoveStaleState removing state" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664134 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="swift-recon-cron" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664147 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664174 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664186 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664194 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664205 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="account-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664217 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664226 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664240 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="object-expirer" Mar 07 07:20:29 crc 
kubenswrapper[4738]: I0307 07:20:29.664250 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="container-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664262 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664274 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664286 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664300 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664308 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="account-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664319 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="swift-recon-cron" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664328 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664337 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="container-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664347 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" 
containerName="object-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664357 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="96ceb9b4-e0fc-488b-8ef4-e7092268ee23" containerName="account-auditor" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664367 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="rsync" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664375 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="container-replicator" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664386 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="42cd11f3-6765-4609-9fc2-c1f0fb2981bf" containerName="object-server" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.664397 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0a930684-b4c9-4238-a12d-01befcc25561" containerName="object-updater" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.669313 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.671801 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-conf" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.671981 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-swift-dockercfg-wm2xz" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.673625 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-files" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.673693 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-storage-config-data" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.695774 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.733098 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-cache\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.733148 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.733202 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " 
pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.733262 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-lock\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.733374 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc2mc\" (UniqueName: \"kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-kube-api-access-tc2mc\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.835304 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.835389 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.835506 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-lock\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.835608 4738 projected.go:288] Couldn't get configMap 
swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.835638 4738 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.835646 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc2mc\" (UniqueName: \"kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-kube-api-access-tc2mc\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:29 crc kubenswrapper[4738]: E0307 07:20:29.835699 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift podName:723fe892-48fa-4502-bd9f-2c07ed1c3dc7 nodeName:}" failed. No retries permitted until 2026-03-07 07:20:30.335677811 +0000 UTC m=+1248.800665142 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift") pod "swift-storage-0" (UID: "723fe892-48fa-4502-bd9f-2c07ed1c3dc7") : configmap "swift-ring-files" not found Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.835796 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-cache\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.836382 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-lock\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.836600 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-cache\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.836608 4738 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") device mount path \"/mnt/openstack/pv07\"" pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.858764 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc2mc\" (UniqueName: \"kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-kube-api-access-tc2mc\") pod \"swift-storage-0\" (UID: 
\"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:29 crc kubenswrapper[4738]: I0307 07:20:29.875136 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:30 crc kubenswrapper[4738]: I0307 07:20:30.343676 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:30 crc kubenswrapper[4738]: E0307 07:20:30.344003 4738 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:20:30 crc kubenswrapper[4738]: E0307 07:20:30.344033 4738 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 07 07:20:30 crc kubenswrapper[4738]: E0307 07:20:30.344102 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift podName:723fe892-48fa-4502-bd9f-2c07ed1c3dc7 nodeName:}" failed. No retries permitted until 2026-03-07 07:20:31.344080913 +0000 UTC m=+1249.809068264 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift") pod "swift-storage-0" (UID: "723fe892-48fa-4502-bd9f-2c07ed1c3dc7") : configmap "swift-ring-files" not found Mar 07 07:20:31 crc kubenswrapper[4738]: I0307 07:20:31.357903 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:31 crc kubenswrapper[4738]: E0307 07:20:31.358251 4738 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:20:31 crc kubenswrapper[4738]: E0307 07:20:31.358298 4738 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 07 07:20:31 crc kubenswrapper[4738]: E0307 07:20:31.358395 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift podName:723fe892-48fa-4502-bd9f-2c07ed1c3dc7 nodeName:}" failed. No retries permitted until 2026-03-07 07:20:33.358367687 +0000 UTC m=+1251.823355048 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift") pod "swift-storage-0" (UID: "723fe892-48fa-4502-bd9f-2c07ed1c3dc7") : configmap "swift-ring-files" not found Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.390053 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:33 crc kubenswrapper[4738]: E0307 07:20:33.390260 4738 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:20:33 crc kubenswrapper[4738]: E0307 07:20:33.390536 4738 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 07 07:20:33 crc kubenswrapper[4738]: E0307 07:20:33.390620 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift podName:723fe892-48fa-4502-bd9f-2c07ed1c3dc7 nodeName:}" failed. No retries permitted until 2026-03-07 07:20:37.390595804 +0000 UTC m=+1255.855583155 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift") pod "swift-storage-0" (UID: "723fe892-48fa-4502-bd9f-2c07ed1c3dc7") : configmap "swift-ring-files" not found Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.569499 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-ckq94"] Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.571005 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.574039 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.574932 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-proxy-config-data" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.575037 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.577462 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-ckq94"] Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.707866 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8ad8b667-e750-405f-9dc9-1e675ee21e58-dispersionconf\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.707971 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8ad8b667-e750-405f-9dc9-1e675ee21e58-etc-swift\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.708102 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8ad8b667-e750-405f-9dc9-1e675ee21e58-swiftconf\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 
crc kubenswrapper[4738]: I0307 07:20:33.708137 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8ad8b667-e750-405f-9dc9-1e675ee21e58-ring-data-devices\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.708286 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwhlh\" (UniqueName: \"kubernetes.io/projected/8ad8b667-e750-405f-9dc9-1e675ee21e58-kube-api-access-bwhlh\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.708309 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ad8b667-e750-405f-9dc9-1e675ee21e58-scripts\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.809900 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8ad8b667-e750-405f-9dc9-1e675ee21e58-swiftconf\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.809959 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8ad8b667-e750-405f-9dc9-1e675ee21e58-ring-data-devices\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.810010 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwhlh\" (UniqueName: \"kubernetes.io/projected/8ad8b667-e750-405f-9dc9-1e675ee21e58-kube-api-access-bwhlh\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.810026 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ad8b667-e750-405f-9dc9-1e675ee21e58-scripts\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.810069 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8ad8b667-e750-405f-9dc9-1e675ee21e58-dispersionconf\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.810090 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8ad8b667-e750-405f-9dc9-1e675ee21e58-etc-swift\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.810910 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8ad8b667-e750-405f-9dc9-1e675ee21e58-etc-swift\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 
07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.811329 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8ad8b667-e750-405f-9dc9-1e675ee21e58-ring-data-devices\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.811350 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ad8b667-e750-405f-9dc9-1e675ee21e58-scripts\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.815090 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8ad8b667-e750-405f-9dc9-1e675ee21e58-dispersionconf\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.815130 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8ad8b667-e750-405f-9dc9-1e675ee21e58-swiftconf\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.825915 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwhlh\" (UniqueName: \"kubernetes.io/projected/8ad8b667-e750-405f-9dc9-1e675ee21e58-kube-api-access-bwhlh\") pod \"swift-ring-rebalance-ckq94\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:33 crc kubenswrapper[4738]: I0307 07:20:33.903930 4738 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:34 crc kubenswrapper[4738]: I0307 07:20:34.148237 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-ckq94"] Mar 07 07:20:34 crc kubenswrapper[4738]: I0307 07:20:34.628850 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" event={"ID":"8ad8b667-e750-405f-9dc9-1e675ee21e58","Type":"ContainerStarted","Data":"72d5bd0ef9fcd9187362469aecc1431d05095186221ab138da11de71511415d2"} Mar 07 07:20:34 crc kubenswrapper[4738]: I0307 07:20:34.629313 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" event={"ID":"8ad8b667-e750-405f-9dc9-1e675ee21e58","Type":"ContainerStarted","Data":"781af740511c0489562b00a6e27690be3f2412831092bc66ceb31ee495cfe911"} Mar 07 07:20:37 crc kubenswrapper[4738]: I0307 07:20:37.470555 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:37 crc kubenswrapper[4738]: E0307 07:20:37.470716 4738 projected.go:288] Couldn't get configMap swift-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Mar 07 07:20:37 crc kubenswrapper[4738]: E0307 07:20:37.471423 4738 projected.go:194] Error preparing data for projected volume etc-swift for pod swift-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Mar 07 07:20:37 crc kubenswrapper[4738]: E0307 07:20:37.471484 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift podName:723fe892-48fa-4502-bd9f-2c07ed1c3dc7 nodeName:}" failed. 
No retries permitted until 2026-03-07 07:20:45.471461988 +0000 UTC m=+1263.936449319 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift") pod "swift-storage-0" (UID: "723fe892-48fa-4502-bd9f-2c07ed1c3dc7") : configmap "swift-ring-files" not found Mar 07 07:20:40 crc kubenswrapper[4738]: I0307 07:20:40.701637 4738 generic.go:334] "Generic (PLEG): container finished" podID="8ad8b667-e750-405f-9dc9-1e675ee21e58" containerID="72d5bd0ef9fcd9187362469aecc1431d05095186221ab138da11de71511415d2" exitCode=0 Mar 07 07:20:40 crc kubenswrapper[4738]: I0307 07:20:40.701751 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" event={"ID":"8ad8b667-e750-405f-9dc9-1e675ee21e58","Type":"ContainerDied","Data":"72d5bd0ef9fcd9187362469aecc1431d05095186221ab138da11de71511415d2"} Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.027577 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.142137 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8ad8b667-e750-405f-9dc9-1e675ee21e58-ring-data-devices\") pod \"8ad8b667-e750-405f-9dc9-1e675ee21e58\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.142222 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8ad8b667-e750-405f-9dc9-1e675ee21e58-dispersionconf\") pod \"8ad8b667-e750-405f-9dc9-1e675ee21e58\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.142275 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ad8b667-e750-405f-9dc9-1e675ee21e58-scripts\") pod \"8ad8b667-e750-405f-9dc9-1e675ee21e58\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.142317 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwhlh\" (UniqueName: \"kubernetes.io/projected/8ad8b667-e750-405f-9dc9-1e675ee21e58-kube-api-access-bwhlh\") pod \"8ad8b667-e750-405f-9dc9-1e675ee21e58\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.142428 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8ad8b667-e750-405f-9dc9-1e675ee21e58-swiftconf\") pod \"8ad8b667-e750-405f-9dc9-1e675ee21e58\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.142461 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/8ad8b667-e750-405f-9dc9-1e675ee21e58-etc-swift\") pod \"8ad8b667-e750-405f-9dc9-1e675ee21e58\" (UID: \"8ad8b667-e750-405f-9dc9-1e675ee21e58\") " Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.142641 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ad8b667-e750-405f-9dc9-1e675ee21e58-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8ad8b667-e750-405f-9dc9-1e675ee21e58" (UID: "8ad8b667-e750-405f-9dc9-1e675ee21e58"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.142806 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8ad8b667-e750-405f-9dc9-1e675ee21e58-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.143610 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ad8b667-e750-405f-9dc9-1e675ee21e58-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8ad8b667-e750-405f-9dc9-1e675ee21e58" (UID: "8ad8b667-e750-405f-9dc9-1e675ee21e58"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.151017 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ad8b667-e750-405f-9dc9-1e675ee21e58-kube-api-access-bwhlh" (OuterVolumeSpecName: "kube-api-access-bwhlh") pod "8ad8b667-e750-405f-9dc9-1e675ee21e58" (UID: "8ad8b667-e750-405f-9dc9-1e675ee21e58"). InnerVolumeSpecName "kube-api-access-bwhlh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.152035 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad8b667-e750-405f-9dc9-1e675ee21e58-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8ad8b667-e750-405f-9dc9-1e675ee21e58" (UID: "8ad8b667-e750-405f-9dc9-1e675ee21e58"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.168507 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ad8b667-e750-405f-9dc9-1e675ee21e58-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8ad8b667-e750-405f-9dc9-1e675ee21e58" (UID: "8ad8b667-e750-405f-9dc9-1e675ee21e58"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.173819 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8ad8b667-e750-405f-9dc9-1e675ee21e58-scripts" (OuterVolumeSpecName: "scripts") pod "8ad8b667-e750-405f-9dc9-1e675ee21e58" (UID: "8ad8b667-e750-405f-9dc9-1e675ee21e58"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.244495 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8ad8b667-e750-405f-9dc9-1e675ee21e58-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.244523 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwhlh\" (UniqueName: \"kubernetes.io/projected/8ad8b667-e750-405f-9dc9-1e675ee21e58-kube-api-access-bwhlh\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.244534 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8ad8b667-e750-405f-9dc9-1e675ee21e58-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.244543 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8ad8b667-e750-405f-9dc9-1e675ee21e58-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.244552 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8ad8b667-e750-405f-9dc9-1e675ee21e58-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.716816 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" event={"ID":"8ad8b667-e750-405f-9dc9-1e675ee21e58","Type":"ContainerDied","Data":"781af740511c0489562b00a6e27690be3f2412831092bc66ceb31ee495cfe911"} Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.717129 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="781af740511c0489562b00a6e27690be3f2412831092bc66ceb31ee495cfe911" Mar 07 07:20:42 crc kubenswrapper[4738]: I0307 07:20:42.716860 4738 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-ckq94" Mar 07 07:20:44 crc kubenswrapper[4738]: I0307 07:20:44.141396 4738 scope.go:117] "RemoveContainer" containerID="dc65af93403a30c37b0d69574145cdcaa325ab4b9badbb646cee1887ddcd1f69" Mar 07 07:20:45 crc kubenswrapper[4738]: I0307 07:20:45.500876 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:45 crc kubenswrapper[4738]: I0307 07:20:45.517448 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift\") pod \"swift-storage-0\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:45 crc kubenswrapper[4738]: I0307 07:20:45.594257 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:20:45 crc kubenswrapper[4738]: I0307 07:20:45.843141 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 07 07:20:46 crc kubenswrapper[4738]: I0307 07:20:46.755696 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"51c349bc5d55ee2d3fc56b86da54c7c34384bedf3f5fe2689bea28ad7921f033"} Mar 07 07:20:46 crc kubenswrapper[4738]: I0307 07:20:46.755998 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"416c7fb6624c5ef2dd15d22a6f968a5dca6cebd03950721f6596ca9685d06444"} Mar 07 07:20:46 crc kubenswrapper[4738]: I0307 07:20:46.756009 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"c3f19f9768659ab0515ccbf7de57cf1e3df945357ed0ef1fe9e486d2c5eb499c"} Mar 07 07:20:46 crc kubenswrapper[4738]: I0307 07:20:46.756016 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"47c03f810320b9a33114fc3c3ba2781d7ab71962a5e28154a225cd2f574bff52"} Mar 07 07:20:46 crc kubenswrapper[4738]: I0307 07:20:46.756024 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"913e842ec4b8fb2ceb7aecd6316998ff118c0efd11bb7125242adca66c43121f"} Mar 07 07:20:46 crc kubenswrapper[4738]: I0307 07:20:46.756032 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"ff6243cf5ab07b63292f7e6730ff539588914d49b60e4da27285db5f28becb16"} Mar 07 07:20:47 crc kubenswrapper[4738]: I0307 07:20:47.778161 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"5d071eeaee3064a50621d8de7dd7af7b823932293d510432e668f78c38c1c4e7"} Mar 07 07:20:47 crc kubenswrapper[4738]: I0307 07:20:47.778468 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"52cb8f78219255bab27dd27dfd8cc4e03cfb0253330ed63d3a97072e8745eb04"} Mar 07 07:20:47 crc kubenswrapper[4738]: I0307 07:20:47.778479 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"9bd4cf4c776e165070f9dc389014feffc9dbf95798d7b06bbaaee8c4e5512986"} Mar 07 07:20:47 crc kubenswrapper[4738]: I0307 07:20:47.778488 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"cbf1846742e98ca6b7886c4be2aef198fcd953fc98f719a09f117ff72ec44b3c"} Mar 07 07:20:47 crc kubenswrapper[4738]: I0307 07:20:47.778497 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"2e138c25ff9cc34c446cd31eed6f66670e0905685accd7512401f944467368d4"} Mar 07 07:20:47 crc kubenswrapper[4738]: I0307 07:20:47.778505 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"2656672c4b40e1b490b16dc369afe4001df59767d99c90164f2927301d09099e"} Mar 07 07:20:47 crc kubenswrapper[4738]: I0307 07:20:47.778514 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"92d682daa1c81453a10d48543264991f1c2ae8439fa5dbdcf45ea378dea1ca87"} Mar 07 07:20:48 crc kubenswrapper[4738]: I0307 07:20:48.797859 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"00a1114ced83cdd25422442b4fc0dbeaa45242e22daac40c0a8c217baf840470"} Mar 07 07:20:48 crc kubenswrapper[4738]: I0307 07:20:48.798249 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"e2afcb3d5bd6da436cd1d63b989f97736f8b78ab7c1d84e305e383973d5b5418"} Mar 07 07:20:48 crc kubenswrapper[4738]: I0307 07:20:48.798265 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"1e33676e1dfc72460ee74d4f28f334f29bb4af4e0ccf64a2a72424ee26b1be7e"} Mar 07 07:20:48 crc kubenswrapper[4738]: I0307 07:20:48.798278 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerStarted","Data":"1a160f2572469e6627e2860f3f697e4574196f2abbf061aa6cf9df71d1e4fac2"} Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.650393 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-0" podStartSLOduration=26.650365171 podStartE2EDuration="26.650365171s" podCreationTimestamp="2026-03-07 07:20:28 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:20:48.856549301 +0000 UTC m=+1267.321536642" watchObservedRunningTime="2026-03-07 07:20:54.650365171 +0000 UTC m=+1273.115352542" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.660516 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-proxy-6ff54dd47f-45755"] Mar 07 07:20:54 crc kubenswrapper[4738]: E0307 07:20:54.660839 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ad8b667-e750-405f-9dc9-1e675ee21e58" containerName="swift-ring-rebalance" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.660860 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ad8b667-e750-405f-9dc9-1e675ee21e58" containerName="swift-ring-rebalance" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.661083 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ad8b667-e750-405f-9dc9-1e675ee21e58" containerName="swift-ring-rebalance" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.662394 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.675303 4738 reflector.go:368] Caches populated for *v1.Secret from object-"swift-kuttl-tests"/"swift-proxy-config-data" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.681782 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-6ff54dd47f-45755"] Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.741911 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-log-httpd\") pod \"swift-proxy-6ff54dd47f-45755\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.742051 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-config-data\") pod \"swift-proxy-6ff54dd47f-45755\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.742103 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxtc2\" (UniqueName: \"kubernetes.io/projected/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-kube-api-access-qxtc2\") pod \"swift-proxy-6ff54dd47f-45755\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.742177 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-etc-swift\") pod \"swift-proxy-6ff54dd47f-45755\" (UID: 
\"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.742209 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-run-httpd\") pod \"swift-proxy-6ff54dd47f-45755\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.844090 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qxtc2\" (UniqueName: \"kubernetes.io/projected/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-kube-api-access-qxtc2\") pod \"swift-proxy-6ff54dd47f-45755\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.844244 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-etc-swift\") pod \"swift-proxy-6ff54dd47f-45755\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.844288 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-run-httpd\") pod \"swift-proxy-6ff54dd47f-45755\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.844365 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-log-httpd\") pod \"swift-proxy-6ff54dd47f-45755\" (UID: 
\"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.844467 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-config-data\") pod \"swift-proxy-6ff54dd47f-45755\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.845388 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-run-httpd\") pod \"swift-proxy-6ff54dd47f-45755\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.845510 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-log-httpd\") pod \"swift-proxy-6ff54dd47f-45755\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.853375 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-etc-swift\") pod \"swift-proxy-6ff54dd47f-45755\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.860827 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-config-data\") pod \"swift-proxy-6ff54dd47f-45755\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " 
pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.868507 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxtc2\" (UniqueName: \"kubernetes.io/projected/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-kube-api-access-qxtc2\") pod \"swift-proxy-6ff54dd47f-45755\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:54 crc kubenswrapper[4738]: I0307 07:20:54.998593 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:55 crc kubenswrapper[4738]: I0307 07:20:55.465914 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-6ff54dd47f-45755"] Mar 07 07:20:55 crc kubenswrapper[4738]: W0307 07:20:55.473851 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77a99f0c_5b1d_489e_8ccf_5daa0a8a20a6.slice/crio-0938d1bf50cf639582a3eab54a65c671da05634d9a6809d4bf096b7d2d359823 WatchSource:0}: Error finding container 0938d1bf50cf639582a3eab54a65c671da05634d9a6809d4bf096b7d2d359823: Status 404 returned error can't find the container with id 0938d1bf50cf639582a3eab54a65c671da05634d9a6809d4bf096b7d2d359823 Mar 07 07:20:55 crc kubenswrapper[4738]: I0307 07:20:55.893531 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" event={"ID":"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6","Type":"ContainerStarted","Data":"fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9"} Mar 07 07:20:55 crc kubenswrapper[4738]: I0307 07:20:55.894141 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" 
event={"ID":"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6","Type":"ContainerStarted","Data":"9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635"} Mar 07 07:20:55 crc kubenswrapper[4738]: I0307 07:20:55.894186 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" event={"ID":"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6","Type":"ContainerStarted","Data":"0938d1bf50cf639582a3eab54a65c671da05634d9a6809d4bf096b7d2d359823"} Mar 07 07:20:55 crc kubenswrapper[4738]: I0307 07:20:55.894244 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:55 crc kubenswrapper[4738]: I0307 07:20:55.894277 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:20:55 crc kubenswrapper[4738]: I0307 07:20:55.932542 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" podStartSLOduration=1.932509724 podStartE2EDuration="1.932509724s" podCreationTimestamp="2026-03-07 07:20:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:20:55.930008215 +0000 UTC m=+1274.394995556" watchObservedRunningTime="2026-03-07 07:20:55.932509724 +0000 UTC m=+1274.397497065" Mar 07 07:21:05 crc kubenswrapper[4738]: I0307 07:21:05.001991 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:21:05 crc kubenswrapper[4738]: I0307 07:21:05.003065 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.192635 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6shwc"] Mar 07 07:21:07 crc 
kubenswrapper[4738]: I0307 07:21:07.194372 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.197554 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.198040 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.212265 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6shwc"] Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.357866 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhdhm\" (UniqueName: \"kubernetes.io/projected/39a490c3-266b-4449-92e1-c91087e2680c-kube-api-access-fhdhm\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.358225 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39a490c3-266b-4449-92e1-c91087e2680c-scripts\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.358321 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/39a490c3-266b-4449-92e1-c91087e2680c-etc-swift\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc 
kubenswrapper[4738]: I0307 07:21:07.358435 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/39a490c3-266b-4449-92e1-c91087e2680c-swiftconf\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.358478 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/39a490c3-266b-4449-92e1-c91087e2680c-ring-data-devices\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.358544 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/39a490c3-266b-4449-92e1-c91087e2680c-dispersionconf\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.459568 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/39a490c3-266b-4449-92e1-c91087e2680c-etc-swift\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.459763 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/39a490c3-266b-4449-92e1-c91087e2680c-swiftconf\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.459846 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/39a490c3-266b-4449-92e1-c91087e2680c-ring-data-devices\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.459923 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/39a490c3-266b-4449-92e1-c91087e2680c-dispersionconf\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.460039 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhdhm\" (UniqueName: \"kubernetes.io/projected/39a490c3-266b-4449-92e1-c91087e2680c-kube-api-access-fhdhm\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.460244 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39a490c3-266b-4449-92e1-c91087e2680c-scripts\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.460585 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/39a490c3-266b-4449-92e1-c91087e2680c-etc-swift\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: 
\"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.461315 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/39a490c3-266b-4449-92e1-c91087e2680c-ring-data-devices\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.462275 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39a490c3-266b-4449-92e1-c91087e2680c-scripts\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.472999 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/39a490c3-266b-4449-92e1-c91087e2680c-dispersionconf\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.473051 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/39a490c3-266b-4449-92e1-c91087e2680c-swiftconf\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.490308 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhdhm\" (UniqueName: \"kubernetes.io/projected/39a490c3-266b-4449-92e1-c91087e2680c-kube-api-access-fhdhm\") pod \"swift-ring-rebalance-debug-6shwc\" (UID: 
\"39a490c3-266b-4449-92e1-c91087e2680c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:07 crc kubenswrapper[4738]: I0307 07:21:07.517863 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:08 crc kubenswrapper[4738]: I0307 07:21:08.035600 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6shwc"] Mar 07 07:21:08 crc kubenswrapper[4738]: W0307 07:21:08.042627 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39a490c3_266b_4449_92e1_c91087e2680c.slice/crio-238c0d5d7a346000f5f426215094d08e0dbbbb604dd4853fc020916fb346fc7a WatchSource:0}: Error finding container 238c0d5d7a346000f5f426215094d08e0dbbbb604dd4853fc020916fb346fc7a: Status 404 returned error can't find the container with id 238c0d5d7a346000f5f426215094d08e0dbbbb604dd4853fc020916fb346fc7a Mar 07 07:21:09 crc kubenswrapper[4738]: I0307 07:21:09.024965 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" event={"ID":"39a490c3-266b-4449-92e1-c91087e2680c","Type":"ContainerStarted","Data":"411a532818975637abe917bde52cbf4f704e808b4c662524cf9287ac5c272f82"} Mar 07 07:21:09 crc kubenswrapper[4738]: I0307 07:21:09.025416 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" event={"ID":"39a490c3-266b-4449-92e1-c91087e2680c","Type":"ContainerStarted","Data":"238c0d5d7a346000f5f426215094d08e0dbbbb604dd4853fc020916fb346fc7a"} Mar 07 07:21:09 crc kubenswrapper[4738]: I0307 07:21:09.043546 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" podStartSLOduration=2.043519967 podStartE2EDuration="2.043519967s" podCreationTimestamp="2026-03-07 07:21:07 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:21:09.04031102 +0000 UTC m=+1287.505298361" watchObservedRunningTime="2026-03-07 07:21:09.043519967 +0000 UTC m=+1287.508507318" Mar 07 07:21:11 crc kubenswrapper[4738]: I0307 07:21:11.046698 4738 generic.go:334] "Generic (PLEG): container finished" podID="39a490c3-266b-4449-92e1-c91087e2680c" containerID="411a532818975637abe917bde52cbf4f704e808b4c662524cf9287ac5c272f82" exitCode=0 Mar 07 07:21:11 crc kubenswrapper[4738]: I0307 07:21:11.046749 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" event={"ID":"39a490c3-266b-4449-92e1-c91087e2680c","Type":"ContainerDied","Data":"411a532818975637abe917bde52cbf4f704e808b4c662524cf9287ac5c272f82"} Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.427097 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.483120 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6shwc"] Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.489035 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6shwc"] Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.578777 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/39a490c3-266b-4449-92e1-c91087e2680c-ring-data-devices\") pod \"39a490c3-266b-4449-92e1-c91087e2680c\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.578844 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhdhm\" (UniqueName: 
\"kubernetes.io/projected/39a490c3-266b-4449-92e1-c91087e2680c-kube-api-access-fhdhm\") pod \"39a490c3-266b-4449-92e1-c91087e2680c\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.578899 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/39a490c3-266b-4449-92e1-c91087e2680c-swiftconf\") pod \"39a490c3-266b-4449-92e1-c91087e2680c\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.578961 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/39a490c3-266b-4449-92e1-c91087e2680c-dispersionconf\") pod \"39a490c3-266b-4449-92e1-c91087e2680c\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.579040 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/39a490c3-266b-4449-92e1-c91087e2680c-etc-swift\") pod \"39a490c3-266b-4449-92e1-c91087e2680c\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.579066 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39a490c3-266b-4449-92e1-c91087e2680c-scripts\") pod \"39a490c3-266b-4449-92e1-c91087e2680c\" (UID: \"39a490c3-266b-4449-92e1-c91087e2680c\") " Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.580138 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39a490c3-266b-4449-92e1-c91087e2680c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "39a490c3-266b-4449-92e1-c91087e2680c" (UID: "39a490c3-266b-4449-92e1-c91087e2680c"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.581238 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a490c3-266b-4449-92e1-c91087e2680c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "39a490c3-266b-4449-92e1-c91087e2680c" (UID: "39a490c3-266b-4449-92e1-c91087e2680c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.587776 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a490c3-266b-4449-92e1-c91087e2680c-kube-api-access-fhdhm" (OuterVolumeSpecName: "kube-api-access-fhdhm") pod "39a490c3-266b-4449-92e1-c91087e2680c" (UID: "39a490c3-266b-4449-92e1-c91087e2680c"). InnerVolumeSpecName "kube-api-access-fhdhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.607983 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a490c3-266b-4449-92e1-c91087e2680c-scripts" (OuterVolumeSpecName: "scripts") pod "39a490c3-266b-4449-92e1-c91087e2680c" (UID: "39a490c3-266b-4449-92e1-c91087e2680c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.610190 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a490c3-266b-4449-92e1-c91087e2680c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "39a490c3-266b-4449-92e1-c91087e2680c" (UID: "39a490c3-266b-4449-92e1-c91087e2680c"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.615861 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39a490c3-266b-4449-92e1-c91087e2680c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "39a490c3-266b-4449-92e1-c91087e2680c" (UID: "39a490c3-266b-4449-92e1-c91087e2680c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.643460 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n"] Mar 07 07:21:12 crc kubenswrapper[4738]: E0307 07:21:12.643992 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a490c3-266b-4449-92e1-c91087e2680c" containerName="swift-ring-rebalance" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.644024 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a490c3-266b-4449-92e1-c91087e2680c" containerName="swift-ring-rebalance" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.644429 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a490c3-266b-4449-92e1-c91087e2680c" containerName="swift-ring-rebalance" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.645628 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.655125 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n"] Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.680877 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/39a490c3-266b-4449-92e1-c91087e2680c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.680909 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39a490c3-266b-4449-92e1-c91087e2680c-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.680921 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/39a490c3-266b-4449-92e1-c91087e2680c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.680935 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhdhm\" (UniqueName: \"kubernetes.io/projected/39a490c3-266b-4449-92e1-c91087e2680c-kube-api-access-fhdhm\") on node \"crc\" DevicePath \"\"" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.680947 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/39a490c3-266b-4449-92e1-c91087e2680c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.680956 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/39a490c3-266b-4449-92e1-c91087e2680c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.782502 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/647b453a-b6ac-4fec-b179-5d0a175bd7df-ring-data-devices\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.782756 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/647b453a-b6ac-4fec-b179-5d0a175bd7df-swiftconf\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.782807 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/647b453a-b6ac-4fec-b179-5d0a175bd7df-etc-swift\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.782966 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbcbr\" (UniqueName: \"kubernetes.io/projected/647b453a-b6ac-4fec-b179-5d0a175bd7df-kube-api-access-gbcbr\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.783036 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/647b453a-b6ac-4fec-b179-5d0a175bd7df-scripts\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 
07:21:12.783202 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/647b453a-b6ac-4fec-b179-5d0a175bd7df-dispersionconf\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.884323 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/647b453a-b6ac-4fec-b179-5d0a175bd7df-dispersionconf\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.884501 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/647b453a-b6ac-4fec-b179-5d0a175bd7df-ring-data-devices\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.884605 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/647b453a-b6ac-4fec-b179-5d0a175bd7df-swiftconf\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.884652 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/647b453a-b6ac-4fec-b179-5d0a175bd7df-etc-swift\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc 
kubenswrapper[4738]: I0307 07:21:12.884785 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbcbr\" (UniqueName: \"kubernetes.io/projected/647b453a-b6ac-4fec-b179-5d0a175bd7df-kube-api-access-gbcbr\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.885638 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/647b453a-b6ac-4fec-b179-5d0a175bd7df-etc-swift\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.885860 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/647b453a-b6ac-4fec-b179-5d0a175bd7df-scripts\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.886294 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/647b453a-b6ac-4fec-b179-5d0a175bd7df-ring-data-devices\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.887028 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/647b453a-b6ac-4fec-b179-5d0a175bd7df-scripts\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc 
kubenswrapper[4738]: I0307 07:21:12.889845 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/647b453a-b6ac-4fec-b179-5d0a175bd7df-swiftconf\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.890463 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/647b453a-b6ac-4fec-b179-5d0a175bd7df-dispersionconf\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.913620 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbcbr\" (UniqueName: \"kubernetes.io/projected/647b453a-b6ac-4fec-b179-5d0a175bd7df-kube-api-access-gbcbr\") pod \"swift-ring-rebalance-debug-vbv2n\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:12 crc kubenswrapper[4738]: I0307 07:21:12.979028 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:13 crc kubenswrapper[4738]: I0307 07:21:13.069604 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="238c0d5d7a346000f5f426215094d08e0dbbbb604dd4853fc020916fb346fc7a" Mar 07 07:21:13 crc kubenswrapper[4738]: I0307 07:21:13.069722 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6shwc" Mar 07 07:21:13 crc kubenswrapper[4738]: I0307 07:21:13.473111 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n"] Mar 07 07:21:14 crc kubenswrapper[4738]: I0307 07:21:14.083631 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" event={"ID":"647b453a-b6ac-4fec-b179-5d0a175bd7df","Type":"ContainerStarted","Data":"c5c192ceb06bd2715e13829524ff465021057a51efdc49629fc92642eac6464c"} Mar 07 07:21:14 crc kubenswrapper[4738]: I0307 07:21:14.083693 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" event={"ID":"647b453a-b6ac-4fec-b179-5d0a175bd7df","Type":"ContainerStarted","Data":"14c0c1c4b8fd9e13345a2fc1b292680f8627377aa7b9ca5a4b17fea0169e0cf8"} Mar 07 07:21:14 crc kubenswrapper[4738]: I0307 07:21:14.119355 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" podStartSLOduration=2.119310096 podStartE2EDuration="2.119310096s" podCreationTimestamp="2026-03-07 07:21:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:21:14.105665764 +0000 UTC m=+1292.570653125" watchObservedRunningTime="2026-03-07 07:21:14.119310096 +0000 UTC m=+1292.584297427" Mar 07 07:21:14 crc kubenswrapper[4738]: I0307 07:21:14.401381 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39a490c3-266b-4449-92e1-c91087e2680c" path="/var/lib/kubelet/pods/39a490c3-266b-4449-92e1-c91087e2680c/volumes" Mar 07 07:21:15 crc kubenswrapper[4738]: I0307 07:21:15.094883 4738 generic.go:334] "Generic (PLEG): container finished" podID="647b453a-b6ac-4fec-b179-5d0a175bd7df" containerID="c5c192ceb06bd2715e13829524ff465021057a51efdc49629fc92642eac6464c" exitCode=0 
Mar 07 07:21:15 crc kubenswrapper[4738]: I0307 07:21:15.094989 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" event={"ID":"647b453a-b6ac-4fec-b179-5d0a175bd7df","Type":"ContainerDied","Data":"c5c192ceb06bd2715e13829524ff465021057a51efdc49629fc92642eac6464c"} Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.464575 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.476864 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/647b453a-b6ac-4fec-b179-5d0a175bd7df-ring-data-devices\") pod \"647b453a-b6ac-4fec-b179-5d0a175bd7df\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.476950 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/647b453a-b6ac-4fec-b179-5d0a175bd7df-scripts\") pod \"647b453a-b6ac-4fec-b179-5d0a175bd7df\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.477039 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/647b453a-b6ac-4fec-b179-5d0a175bd7df-etc-swift\") pod \"647b453a-b6ac-4fec-b179-5d0a175bd7df\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.477249 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/647b453a-b6ac-4fec-b179-5d0a175bd7df-swiftconf\") pod \"647b453a-b6ac-4fec-b179-5d0a175bd7df\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.477425 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/647b453a-b6ac-4fec-b179-5d0a175bd7df-dispersionconf\") pod \"647b453a-b6ac-4fec-b179-5d0a175bd7df\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.477494 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbcbr\" (UniqueName: \"kubernetes.io/projected/647b453a-b6ac-4fec-b179-5d0a175bd7df-kube-api-access-gbcbr\") pod \"647b453a-b6ac-4fec-b179-5d0a175bd7df\" (UID: \"647b453a-b6ac-4fec-b179-5d0a175bd7df\") " Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.477692 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647b453a-b6ac-4fec-b179-5d0a175bd7df-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "647b453a-b6ac-4fec-b179-5d0a175bd7df" (UID: "647b453a-b6ac-4fec-b179-5d0a175bd7df"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.477894 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/647b453a-b6ac-4fec-b179-5d0a175bd7df-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "647b453a-b6ac-4fec-b179-5d0a175bd7df" (UID: "647b453a-b6ac-4fec-b179-5d0a175bd7df"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.478490 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/647b453a-b6ac-4fec-b179-5d0a175bd7df-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.478541 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/647b453a-b6ac-4fec-b179-5d0a175bd7df-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.489452 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647b453a-b6ac-4fec-b179-5d0a175bd7df-kube-api-access-gbcbr" (OuterVolumeSpecName: "kube-api-access-gbcbr") pod "647b453a-b6ac-4fec-b179-5d0a175bd7df" (UID: "647b453a-b6ac-4fec-b179-5d0a175bd7df"). InnerVolumeSpecName "kube-api-access-gbcbr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.506959 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n"] Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.508421 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/647b453a-b6ac-4fec-b179-5d0a175bd7df-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "647b453a-b6ac-4fec-b179-5d0a175bd7df" (UID: "647b453a-b6ac-4fec-b179-5d0a175bd7df"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.511807 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n"] Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.512054 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/647b453a-b6ac-4fec-b179-5d0a175bd7df-scripts" (OuterVolumeSpecName: "scripts") pod "647b453a-b6ac-4fec-b179-5d0a175bd7df" (UID: "647b453a-b6ac-4fec-b179-5d0a175bd7df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.521307 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/647b453a-b6ac-4fec-b179-5d0a175bd7df-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "647b453a-b6ac-4fec-b179-5d0a175bd7df" (UID: "647b453a-b6ac-4fec-b179-5d0a175bd7df"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.580640 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/647b453a-b6ac-4fec-b179-5d0a175bd7df-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.580680 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/647b453a-b6ac-4fec-b179-5d0a175bd7df-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.580690 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbcbr\" (UniqueName: \"kubernetes.io/projected/647b453a-b6ac-4fec-b179-5d0a175bd7df-kube-api-access-gbcbr\") on node \"crc\" DevicePath \"\"" Mar 07 07:21:16 crc kubenswrapper[4738]: I0307 07:21:16.580699 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/647b453a-b6ac-4fec-b179-5d0a175bd7df-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:21:17 crc kubenswrapper[4738]: I0307 07:21:17.114044 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14c0c1c4b8fd9e13345a2fc1b292680f8627377aa7b9ca5a4b17fea0169e0cf8" Mar 07 07:21:17 crc kubenswrapper[4738]: I0307 07:21:17.114217 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vbv2n" Mar 07 07:21:18 crc kubenswrapper[4738]: I0307 07:21:18.414601 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="647b453a-b6ac-4fec-b179-5d0a175bd7df" path="/var/lib/kubelet/pods/647b453a-b6ac-4fec-b179-5d0a175bd7df/volumes" Mar 07 07:21:26 crc kubenswrapper[4738]: I0307 07:21:26.957913 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:21:26 crc kubenswrapper[4738]: I0307 07:21:26.958597 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:21:56 crc kubenswrapper[4738]: I0307 07:21:56.957977 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:21:56 crc kubenswrapper[4738]: I0307 07:21:56.958600 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:22:00 crc kubenswrapper[4738]: I0307 07:22:00.151409 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547802-n9t6g"] 
Mar 07 07:22:00 crc kubenswrapper[4738]: E0307 07:22:00.152063 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="647b453a-b6ac-4fec-b179-5d0a175bd7df" containerName="swift-ring-rebalance" Mar 07 07:22:00 crc kubenswrapper[4738]: I0307 07:22:00.152079 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="647b453a-b6ac-4fec-b179-5d0a175bd7df" containerName="swift-ring-rebalance" Mar 07 07:22:00 crc kubenswrapper[4738]: I0307 07:22:00.152269 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="647b453a-b6ac-4fec-b179-5d0a175bd7df" containerName="swift-ring-rebalance" Mar 07 07:22:00 crc kubenswrapper[4738]: I0307 07:22:00.152831 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547802-n9t6g" Mar 07 07:22:00 crc kubenswrapper[4738]: I0307 07:22:00.154492 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:22:00 crc kubenswrapper[4738]: I0307 07:22:00.154934 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:22:00 crc kubenswrapper[4738]: I0307 07:22:00.155875 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:22:00 crc kubenswrapper[4738]: I0307 07:22:00.159871 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547802-n9t6g"] Mar 07 07:22:00 crc kubenswrapper[4738]: I0307 07:22:00.199253 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt4dn\" (UniqueName: \"kubernetes.io/projected/8083f68a-c9a9-481e-87a2-c23abb9a44f1-kube-api-access-lt4dn\") pod \"auto-csr-approver-29547802-n9t6g\" (UID: \"8083f68a-c9a9-481e-87a2-c23abb9a44f1\") " pod="openshift-infra/auto-csr-approver-29547802-n9t6g" Mar 07 07:22:00 crc kubenswrapper[4738]: I0307 
07:22:00.300133 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt4dn\" (UniqueName: \"kubernetes.io/projected/8083f68a-c9a9-481e-87a2-c23abb9a44f1-kube-api-access-lt4dn\") pod \"auto-csr-approver-29547802-n9t6g\" (UID: \"8083f68a-c9a9-481e-87a2-c23abb9a44f1\") " pod="openshift-infra/auto-csr-approver-29547802-n9t6g" Mar 07 07:22:00 crc kubenswrapper[4738]: I0307 07:22:00.317626 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt4dn\" (UniqueName: \"kubernetes.io/projected/8083f68a-c9a9-481e-87a2-c23abb9a44f1-kube-api-access-lt4dn\") pod \"auto-csr-approver-29547802-n9t6g\" (UID: \"8083f68a-c9a9-481e-87a2-c23abb9a44f1\") " pod="openshift-infra/auto-csr-approver-29547802-n9t6g" Mar 07 07:22:00 crc kubenswrapper[4738]: I0307 07:22:00.468447 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547802-n9t6g" Mar 07 07:22:00 crc kubenswrapper[4738]: I0307 07:22:00.741791 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547802-n9t6g"] Mar 07 07:22:00 crc kubenswrapper[4738]: I0307 07:22:00.751841 4738 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:22:01 crc kubenswrapper[4738]: I0307 07:22:01.520779 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547802-n9t6g" event={"ID":"8083f68a-c9a9-481e-87a2-c23abb9a44f1","Type":"ContainerStarted","Data":"2e314d5340543597ae319453d17c807aa9cc9103707f1122aaf91475c0c8d84f"} Mar 07 07:22:02 crc kubenswrapper[4738]: I0307 07:22:02.531068 4738 generic.go:334] "Generic (PLEG): container finished" podID="8083f68a-c9a9-481e-87a2-c23abb9a44f1" containerID="a639d6ac8983692cebcf643c0a55826fda53291deb7f4a98ff8ab7c5cf11dcac" exitCode=0 Mar 07 07:22:02 crc kubenswrapper[4738]: I0307 07:22:02.531222 4738 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-infra/auto-csr-approver-29547802-n9t6g" event={"ID":"8083f68a-c9a9-481e-87a2-c23abb9a44f1","Type":"ContainerDied","Data":"a639d6ac8983692cebcf643c0a55826fda53291deb7f4a98ff8ab7c5cf11dcac"} Mar 07 07:22:03 crc kubenswrapper[4738]: I0307 07:22:03.857905 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547802-n9t6g" Mar 07 07:22:03 crc kubenswrapper[4738]: I0307 07:22:03.967241 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lt4dn\" (UniqueName: \"kubernetes.io/projected/8083f68a-c9a9-481e-87a2-c23abb9a44f1-kube-api-access-lt4dn\") pod \"8083f68a-c9a9-481e-87a2-c23abb9a44f1\" (UID: \"8083f68a-c9a9-481e-87a2-c23abb9a44f1\") " Mar 07 07:22:03 crc kubenswrapper[4738]: I0307 07:22:03.972203 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8083f68a-c9a9-481e-87a2-c23abb9a44f1-kube-api-access-lt4dn" (OuterVolumeSpecName: "kube-api-access-lt4dn") pod "8083f68a-c9a9-481e-87a2-c23abb9a44f1" (UID: "8083f68a-c9a9-481e-87a2-c23abb9a44f1"). InnerVolumeSpecName "kube-api-access-lt4dn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:22:04 crc kubenswrapper[4738]: I0307 07:22:04.068765 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lt4dn\" (UniqueName: \"kubernetes.io/projected/8083f68a-c9a9-481e-87a2-c23abb9a44f1-kube-api-access-lt4dn\") on node \"crc\" DevicePath \"\"" Mar 07 07:22:04 crc kubenswrapper[4738]: I0307 07:22:04.551576 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547802-n9t6g" event={"ID":"8083f68a-c9a9-481e-87a2-c23abb9a44f1","Type":"ContainerDied","Data":"2e314d5340543597ae319453d17c807aa9cc9103707f1122aaf91475c0c8d84f"} Mar 07 07:22:04 crc kubenswrapper[4738]: I0307 07:22:04.551842 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e314d5340543597ae319453d17c807aa9cc9103707f1122aaf91475c0c8d84f" Mar 07 07:22:04 crc kubenswrapper[4738]: I0307 07:22:04.551698 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547802-n9t6g" Mar 07 07:22:04 crc kubenswrapper[4738]: I0307 07:22:04.934608 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547796-qtj6f"] Mar 07 07:22:04 crc kubenswrapper[4738]: I0307 07:22:04.948117 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547796-qtj6f"] Mar 07 07:22:06 crc kubenswrapper[4738]: I0307 07:22:06.402783 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077a5b42-3ee0-4fcd-90cd-72122279a352" path="/var/lib/kubelet/pods/077a5b42-3ee0-4fcd-90cd-72122279a352/volumes" Mar 07 07:22:26 crc kubenswrapper[4738]: I0307 07:22:26.958273 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 07 07:22:26 crc kubenswrapper[4738]: I0307 07:22:26.959215 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:22:26 crc kubenswrapper[4738]: I0307 07:22:26.959317 4738 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:22:26 crc kubenswrapper[4738]: I0307 07:22:26.960437 4738 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"35a21fe5da4e09d4cd3572da738e94c43773cf6bc3d6e32eacd728f4e0c1f246"} pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:22:26 crc kubenswrapper[4738]: I0307 07:22:26.960573 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" containerID="cri-o://35a21fe5da4e09d4cd3572da738e94c43773cf6bc3d6e32eacd728f4e0c1f246" gracePeriod=600 Mar 07 07:22:27 crc kubenswrapper[4738]: I0307 07:22:27.768126 4738 generic.go:334] "Generic (PLEG): container finished" podID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerID="35a21fe5da4e09d4cd3572da738e94c43773cf6bc3d6e32eacd728f4e0c1f246" exitCode=0 Mar 07 07:22:27 crc kubenswrapper[4738]: I0307 07:22:27.768241 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" 
event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerDied","Data":"35a21fe5da4e09d4cd3572da738e94c43773cf6bc3d6e32eacd728f4e0c1f246"} Mar 07 07:22:27 crc kubenswrapper[4738]: I0307 07:22:27.769015 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerStarted","Data":"e68d48226357ef44bc0d64b15acc6a8bc6f19bd23144aa64037a10caf100ed7a"} Mar 07 07:22:27 crc kubenswrapper[4738]: I0307 07:22:27.769053 4738 scope.go:117] "RemoveContainer" containerID="e5bec806593533cac1b1c1cc77e678f82478e03cc5fcc9234057125e94ccf2f7" Mar 07 07:22:44 crc kubenswrapper[4738]: I0307 07:22:44.335478 4738 scope.go:117] "RemoveContainer" containerID="bb3aeb5ab513d22a30fae4da917954a5069995ddc59fe2494f40016ad41743aa" Mar 07 07:22:50 crc kubenswrapper[4738]: E0307 07:22:50.486314 4738 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.51:56724->38.102.83.51:38985: write tcp 38.102.83.51:56724->38.102.83.51:38985: write: broken pipe Mar 07 07:23:10 crc kubenswrapper[4738]: I0307 07:23:10.002112 4738 prober.go:107] "Probe failed" probeType="Readiness" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" podUID="77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 07 07:23:22 crc kubenswrapper[4738]: E0307 07:23:22.135034 4738 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.51:39580->38.102.83.51:38985: read tcp 38.102.83.51:39580->38.102.83.51:38985: read: connection reset by peer Mar 07 07:23:42 crc kubenswrapper[4738]: E0307 07:23:42.618277 4738 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.51:37900->38.102.83.51:38985: write tcp 38.102.83.51:37900->38.102.83.51:38985: write: broken pipe Mar 07 07:24:00 crc kubenswrapper[4738]: I0307 07:24:00.137470 
4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547804-b4fzl"] Mar 07 07:24:00 crc kubenswrapper[4738]: E0307 07:24:00.139688 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8083f68a-c9a9-481e-87a2-c23abb9a44f1" containerName="oc" Mar 07 07:24:00 crc kubenswrapper[4738]: I0307 07:24:00.139780 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="8083f68a-c9a9-481e-87a2-c23abb9a44f1" containerName="oc" Mar 07 07:24:00 crc kubenswrapper[4738]: I0307 07:24:00.140021 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="8083f68a-c9a9-481e-87a2-c23abb9a44f1" containerName="oc" Mar 07 07:24:00 crc kubenswrapper[4738]: I0307 07:24:00.140795 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547804-b4fzl" Mar 07 07:24:00 crc kubenswrapper[4738]: I0307 07:24:00.142751 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:24:00 crc kubenswrapper[4738]: I0307 07:24:00.143810 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:24:00 crc kubenswrapper[4738]: I0307 07:24:00.145735 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:24:00 crc kubenswrapper[4738]: I0307 07:24:00.152570 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547804-b4fzl"] Mar 07 07:24:00 crc kubenswrapper[4738]: I0307 07:24:00.255854 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6856\" (UniqueName: \"kubernetes.io/projected/d1a4252b-fc2b-4a13-b058-847d209e04d3-kube-api-access-b6856\") pod \"auto-csr-approver-29547804-b4fzl\" (UID: \"d1a4252b-fc2b-4a13-b058-847d209e04d3\") " 
pod="openshift-infra/auto-csr-approver-29547804-b4fzl" Mar 07 07:24:00 crc kubenswrapper[4738]: I0307 07:24:00.358293 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6856\" (UniqueName: \"kubernetes.io/projected/d1a4252b-fc2b-4a13-b058-847d209e04d3-kube-api-access-b6856\") pod \"auto-csr-approver-29547804-b4fzl\" (UID: \"d1a4252b-fc2b-4a13-b058-847d209e04d3\") " pod="openshift-infra/auto-csr-approver-29547804-b4fzl" Mar 07 07:24:00 crc kubenswrapper[4738]: I0307 07:24:00.383013 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6856\" (UniqueName: \"kubernetes.io/projected/d1a4252b-fc2b-4a13-b058-847d209e04d3-kube-api-access-b6856\") pod \"auto-csr-approver-29547804-b4fzl\" (UID: \"d1a4252b-fc2b-4a13-b058-847d209e04d3\") " pod="openshift-infra/auto-csr-approver-29547804-b4fzl" Mar 07 07:24:00 crc kubenswrapper[4738]: I0307 07:24:00.504229 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547804-b4fzl" Mar 07 07:24:00 crc kubenswrapper[4738]: I0307 07:24:00.948371 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547804-b4fzl"] Mar 07 07:24:00 crc kubenswrapper[4738]: W0307 07:24:00.958682 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd1a4252b_fc2b_4a13_b058_847d209e04d3.slice/crio-e100b8c7927eef6903c0ec3e35da76d1759cf24a43e5223fb3ace8f66ce05627 WatchSource:0}: Error finding container e100b8c7927eef6903c0ec3e35da76d1759cf24a43e5223fb3ace8f66ce05627: Status 404 returned error can't find the container with id e100b8c7927eef6903c0ec3e35da76d1759cf24a43e5223fb3ace8f66ce05627 Mar 07 07:24:01 crc kubenswrapper[4738]: I0307 07:24:01.747690 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547804-b4fzl" 
event={"ID":"d1a4252b-fc2b-4a13-b058-847d209e04d3","Type":"ContainerStarted","Data":"e100b8c7927eef6903c0ec3e35da76d1759cf24a43e5223fb3ace8f66ce05627"} Mar 07 07:24:02 crc kubenswrapper[4738]: I0307 07:24:02.759891 4738 generic.go:334] "Generic (PLEG): container finished" podID="d1a4252b-fc2b-4a13-b058-847d209e04d3" containerID="690b06c8febb5ae7141b4cf073703b6f3e5f75bfc471e9734ca36ab160edf53a" exitCode=0 Mar 07 07:24:02 crc kubenswrapper[4738]: I0307 07:24:02.759975 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547804-b4fzl" event={"ID":"d1a4252b-fc2b-4a13-b058-847d209e04d3","Type":"ContainerDied","Data":"690b06c8febb5ae7141b4cf073703b6f3e5f75bfc471e9734ca36ab160edf53a"} Mar 07 07:24:04 crc kubenswrapper[4738]: I0307 07:24:04.071098 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547804-b4fzl" Mar 07 07:24:04 crc kubenswrapper[4738]: I0307 07:24:04.121210 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6856\" (UniqueName: \"kubernetes.io/projected/d1a4252b-fc2b-4a13-b058-847d209e04d3-kube-api-access-b6856\") pod \"d1a4252b-fc2b-4a13-b058-847d209e04d3\" (UID: \"d1a4252b-fc2b-4a13-b058-847d209e04d3\") " Mar 07 07:24:04 crc kubenswrapper[4738]: I0307 07:24:04.132878 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1a4252b-fc2b-4a13-b058-847d209e04d3-kube-api-access-b6856" (OuterVolumeSpecName: "kube-api-access-b6856") pod "d1a4252b-fc2b-4a13-b058-847d209e04d3" (UID: "d1a4252b-fc2b-4a13-b058-847d209e04d3"). InnerVolumeSpecName "kube-api-access-b6856". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:24:04 crc kubenswrapper[4738]: I0307 07:24:04.222786 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6856\" (UniqueName: \"kubernetes.io/projected/d1a4252b-fc2b-4a13-b058-847d209e04d3-kube-api-access-b6856\") on node \"crc\" DevicePath \"\"" Mar 07 07:24:04 crc kubenswrapper[4738]: I0307 07:24:04.781842 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547804-b4fzl" event={"ID":"d1a4252b-fc2b-4a13-b058-847d209e04d3","Type":"ContainerDied","Data":"e100b8c7927eef6903c0ec3e35da76d1759cf24a43e5223fb3ace8f66ce05627"} Mar 07 07:24:04 crc kubenswrapper[4738]: I0307 07:24:04.781887 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e100b8c7927eef6903c0ec3e35da76d1759cf24a43e5223fb3ace8f66ce05627" Mar 07 07:24:04 crc kubenswrapper[4738]: I0307 07:24:04.782351 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547804-b4fzl" Mar 07 07:24:05 crc kubenswrapper[4738]: I0307 07:24:05.158451 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547798-hf5hx"] Mar 07 07:24:05 crc kubenswrapper[4738]: I0307 07:24:05.164929 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547798-hf5hx"] Mar 07 07:24:06 crc kubenswrapper[4738]: I0307 07:24:06.394580 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="64e3d0e4-8776-4fb3-9a4b-4205a9e755c1" path="/var/lib/kubelet/pods/64e3d0e4-8776-4fb3-9a4b-4205a9e755c1/volumes" Mar 07 07:24:44 crc kubenswrapper[4738]: I0307 07:24:44.431687 4738 scope.go:117] "RemoveContainer" containerID="a02e9bf78d4168178809e4f70eefd0ac959530c049ac10ae0083653262e5efa8" Mar 07 07:24:56 crc kubenswrapper[4738]: I0307 07:24:56.958294 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:24:56 crc kubenswrapper[4738]: I0307 07:24:56.958985 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:25:26 crc kubenswrapper[4738]: I0307 07:25:26.957675 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:25:26 crc kubenswrapper[4738]: I0307 07:25:26.958231 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.498876 4738 scope.go:117] "RemoveContainer" containerID="268471195f290d34a1640c6c40556c0cc0df5c38ba265a2054454572ddb1a8aa" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.539875 4738 scope.go:117] "RemoveContainer" containerID="5b8dd2431b82438d7e854cccd8e8b8fcf937aa0db16eb2239920149686134af0" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.576116 4738 scope.go:117] "RemoveContainer" containerID="5c8a691cfe1a8ab8c30a26b23434cdb8ee4f0a45358a201eb4ecef3b2797785a" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.623699 4738 scope.go:117] "RemoveContainer" 
containerID="b4724281a96bff261edcd51fb75fe7f352935e15dfd29e22addc75c7164d87df" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.664037 4738 scope.go:117] "RemoveContainer" containerID="023ab348f9630e3188399acebb441225f16fe8a3ea0a45562a1416ce00ed2a1d" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.691261 4738 scope.go:117] "RemoveContainer" containerID="a2ffa63c4afb5dd07e9a57782d9c93b48afd95857655b0d188d0b3255e43c57d" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.713757 4738 scope.go:117] "RemoveContainer" containerID="4ac2a985e3cb60a09d256b7974029df113f3080d4297a16f859b41d021df8333" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.737368 4738 scope.go:117] "RemoveContainer" containerID="634c98d8bd723fe8365bb0a579d302058fe17e15dc44a8c89fb077a91f9db720" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.765127 4738 scope.go:117] "RemoveContainer" containerID="7a9f320778051ddf4b7af67e32ddd382aad3976afba27bea7e44cc7f637a1ec6" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.792387 4738 scope.go:117] "RemoveContainer" containerID="0bd0f18099d0d2501aaf5782a927bc5913e1c189ce48a68e47f375657b6799b0" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.816245 4738 scope.go:117] "RemoveContainer" containerID="137defbc952bf81c1f496e28b2c150ef0bf8561eb3634039ffac430db7e04802" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.838039 4738 scope.go:117] "RemoveContainer" containerID="90084ef90d77e89230917bd780cb78df4fafd97840fe50ada61c1c20a43914f9" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.856019 4738 scope.go:117] "RemoveContainer" containerID="c9a84b5ce18a92c7cd8e2aa843f5b3cacd9d67fea7fb130fc739af256a55b82c" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.874150 4738 scope.go:117] "RemoveContainer" containerID="7e3233cc25e094b9420df4a7ee6bad5a6ddbd092e5b08e43d31d29b920a2d134" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.896214 4738 scope.go:117] "RemoveContainer" 
containerID="5a05ae7d39e3985637e6577343ab598669a43dc85bcd5dd09ddc9b373c1cc837" Mar 07 07:25:44 crc kubenswrapper[4738]: I0307 07:25:44.941752 4738 scope.go:117] "RemoveContainer" containerID="fea746265db6cca797f4aab10f0142d479f000ff811043ba2d786b1386e70f97" Mar 07 07:25:56 crc kubenswrapper[4738]: I0307 07:25:56.957953 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:25:56 crc kubenswrapper[4738]: I0307 07:25:56.958787 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:25:56 crc kubenswrapper[4738]: I0307 07:25:56.958853 4738 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:25:56 crc kubenswrapper[4738]: I0307 07:25:56.959817 4738 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e68d48226357ef44bc0d64b15acc6a8bc6f19bd23144aa64037a10caf100ed7a"} pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:25:56 crc kubenswrapper[4738]: I0307 07:25:56.959921 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" 
containerID="cri-o://e68d48226357ef44bc0d64b15acc6a8bc6f19bd23144aa64037a10caf100ed7a" gracePeriod=600 Mar 07 07:25:57 crc kubenswrapper[4738]: I0307 07:25:57.813473 4738 generic.go:334] "Generic (PLEG): container finished" podID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerID="e68d48226357ef44bc0d64b15acc6a8bc6f19bd23144aa64037a10caf100ed7a" exitCode=0 Mar 07 07:25:57 crc kubenswrapper[4738]: I0307 07:25:57.813591 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerDied","Data":"e68d48226357ef44bc0d64b15acc6a8bc6f19bd23144aa64037a10caf100ed7a"} Mar 07 07:25:57 crc kubenswrapper[4738]: I0307 07:25:57.814509 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerStarted","Data":"46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15"} Mar 07 07:25:57 crc kubenswrapper[4738]: I0307 07:25:57.814615 4738 scope.go:117] "RemoveContainer" containerID="35a21fe5da4e09d4cd3572da738e94c43773cf6bc3d6e32eacd728f4e0c1f246" Mar 07 07:26:00 crc kubenswrapper[4738]: I0307 07:26:00.155405 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547806-mb6pt"] Mar 07 07:26:00 crc kubenswrapper[4738]: E0307 07:26:00.157050 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1a4252b-fc2b-4a13-b058-847d209e04d3" containerName="oc" Mar 07 07:26:00 crc kubenswrapper[4738]: I0307 07:26:00.157075 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1a4252b-fc2b-4a13-b058-847d209e04d3" containerName="oc" Mar 07 07:26:00 crc kubenswrapper[4738]: I0307 07:26:00.157283 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1a4252b-fc2b-4a13-b058-847d209e04d3" containerName="oc" Mar 07 07:26:00 crc kubenswrapper[4738]: I0307 
07:26:00.158069 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547806-mb6pt" Mar 07 07:26:00 crc kubenswrapper[4738]: I0307 07:26:00.164100 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:26:00 crc kubenswrapper[4738]: I0307 07:26:00.165145 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:26:00 crc kubenswrapper[4738]: I0307 07:26:00.165609 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:26:00 crc kubenswrapper[4738]: I0307 07:26:00.172558 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547806-mb6pt"] Mar 07 07:26:00 crc kubenswrapper[4738]: I0307 07:26:00.234524 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lgmc\" (UniqueName: \"kubernetes.io/projected/a051bc9b-524f-4efb-94c5-06f7cfe2d2cb-kube-api-access-8lgmc\") pod \"auto-csr-approver-29547806-mb6pt\" (UID: \"a051bc9b-524f-4efb-94c5-06f7cfe2d2cb\") " pod="openshift-infra/auto-csr-approver-29547806-mb6pt" Mar 07 07:26:00 crc kubenswrapper[4738]: I0307 07:26:00.336457 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lgmc\" (UniqueName: \"kubernetes.io/projected/a051bc9b-524f-4efb-94c5-06f7cfe2d2cb-kube-api-access-8lgmc\") pod \"auto-csr-approver-29547806-mb6pt\" (UID: \"a051bc9b-524f-4efb-94c5-06f7cfe2d2cb\") " pod="openshift-infra/auto-csr-approver-29547806-mb6pt" Mar 07 07:26:00 crc kubenswrapper[4738]: I0307 07:26:00.356061 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lgmc\" (UniqueName: \"kubernetes.io/projected/a051bc9b-524f-4efb-94c5-06f7cfe2d2cb-kube-api-access-8lgmc\") pod 
\"auto-csr-approver-29547806-mb6pt\" (UID: \"a051bc9b-524f-4efb-94c5-06f7cfe2d2cb\") " pod="openshift-infra/auto-csr-approver-29547806-mb6pt" Mar 07 07:26:00 crc kubenswrapper[4738]: I0307 07:26:00.481579 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547806-mb6pt" Mar 07 07:26:00 crc kubenswrapper[4738]: I0307 07:26:00.900120 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547806-mb6pt"] Mar 07 07:26:00 crc kubenswrapper[4738]: W0307 07:26:00.905415 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda051bc9b_524f_4efb_94c5_06f7cfe2d2cb.slice/crio-a95950964b52c91502ba3131cab33cefa1426977f6ea17782e89e1e60521a159 WatchSource:0}: Error finding container a95950964b52c91502ba3131cab33cefa1426977f6ea17782e89e1e60521a159: Status 404 returned error can't find the container with id a95950964b52c91502ba3131cab33cefa1426977f6ea17782e89e1e60521a159 Mar 07 07:26:01 crc kubenswrapper[4738]: I0307 07:26:01.855313 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547806-mb6pt" event={"ID":"a051bc9b-524f-4efb-94c5-06f7cfe2d2cb","Type":"ContainerStarted","Data":"a95950964b52c91502ba3131cab33cefa1426977f6ea17782e89e1e60521a159"} Mar 07 07:26:02 crc kubenswrapper[4738]: I0307 07:26:02.069022 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/root-account-create-update-j7gqh"] Mar 07 07:26:02 crc kubenswrapper[4738]: I0307 07:26:02.078598 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/root-account-create-update-j7gqh"] Mar 07 07:26:02 crc kubenswrapper[4738]: I0307 07:26:02.396643 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63ca5884-f8c3-464b-8621-1e97d77e5083" path="/var/lib/kubelet/pods/63ca5884-f8c3-464b-8621-1e97d77e5083/volumes" Mar 07 07:26:02 crc 
kubenswrapper[4738]: I0307 07:26:02.869568 4738 generic.go:334] "Generic (PLEG): container finished" podID="a051bc9b-524f-4efb-94c5-06f7cfe2d2cb" containerID="0e720457c242f88560492c41c224f50e15dda393fb14314e4d0aee6533f8e980" exitCode=0 Mar 07 07:26:02 crc kubenswrapper[4738]: I0307 07:26:02.869634 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547806-mb6pt" event={"ID":"a051bc9b-524f-4efb-94c5-06f7cfe2d2cb","Type":"ContainerDied","Data":"0e720457c242f88560492c41c224f50e15dda393fb14314e4d0aee6533f8e980"} Mar 07 07:26:04 crc kubenswrapper[4738]: I0307 07:26:04.231308 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547806-mb6pt" Mar 07 07:26:04 crc kubenswrapper[4738]: I0307 07:26:04.299136 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lgmc\" (UniqueName: \"kubernetes.io/projected/a051bc9b-524f-4efb-94c5-06f7cfe2d2cb-kube-api-access-8lgmc\") pod \"a051bc9b-524f-4efb-94c5-06f7cfe2d2cb\" (UID: \"a051bc9b-524f-4efb-94c5-06f7cfe2d2cb\") " Mar 07 07:26:04 crc kubenswrapper[4738]: I0307 07:26:04.304048 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a051bc9b-524f-4efb-94c5-06f7cfe2d2cb-kube-api-access-8lgmc" (OuterVolumeSpecName: "kube-api-access-8lgmc") pod "a051bc9b-524f-4efb-94c5-06f7cfe2d2cb" (UID: "a051bc9b-524f-4efb-94c5-06f7cfe2d2cb"). InnerVolumeSpecName "kube-api-access-8lgmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:26:04 crc kubenswrapper[4738]: I0307 07:26:04.402202 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lgmc\" (UniqueName: \"kubernetes.io/projected/a051bc9b-524f-4efb-94c5-06f7cfe2d2cb-kube-api-access-8lgmc\") on node \"crc\" DevicePath \"\"" Mar 07 07:26:04 crc kubenswrapper[4738]: I0307 07:26:04.891948 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547806-mb6pt" event={"ID":"a051bc9b-524f-4efb-94c5-06f7cfe2d2cb","Type":"ContainerDied","Data":"a95950964b52c91502ba3131cab33cefa1426977f6ea17782e89e1e60521a159"} Mar 07 07:26:04 crc kubenswrapper[4738]: I0307 07:26:04.891994 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a95950964b52c91502ba3131cab33cefa1426977f6ea17782e89e1e60521a159" Mar 07 07:26:04 crc kubenswrapper[4738]: I0307 07:26:04.892138 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547806-mb6pt" Mar 07 07:26:05 crc kubenswrapper[4738]: I0307 07:26:05.287477 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547800-92slb"] Mar 07 07:26:05 crc kubenswrapper[4738]: I0307 07:26:05.294887 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547800-92slb"] Mar 07 07:26:06 crc kubenswrapper[4738]: I0307 07:26:06.405294 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c463008-ff10-4f2f-9d74-1b78a30c5720" path="/var/lib/kubelet/pods/6c463008-ff10-4f2f-9d74-1b78a30c5720/volumes" Mar 07 07:26:17 crc kubenswrapper[4738]: E0307 07:26:17.676474 4738 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.51:35646->38.102.83.51:38985: write tcp 38.102.83.51:35646->38.102.83.51:38985: write: broken pipe Mar 07 07:26:45 crc kubenswrapper[4738]: I0307 07:26:45.008998 4738 
scope.go:117] "RemoveContainer" containerID="de970a9fb6f42b417e39ea65d51a5dd6183c4734303d7d6e8f4e06e625946d55" Mar 07 07:26:45 crc kubenswrapper[4738]: I0307 07:26:45.032174 4738 scope.go:117] "RemoveContainer" containerID="a1119f150a952b18a183cdcd8ca742866959d1872e944347aadf7a64dee76b9b" Mar 07 07:26:45 crc kubenswrapper[4738]: I0307 07:26:45.061126 4738 scope.go:117] "RemoveContainer" containerID="df5138da1b2a5108d9af7e375900102cbed7c5796f953e33700da6d1b19fd24d" Mar 07 07:27:11 crc kubenswrapper[4738]: I0307 07:27:11.410761 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-fnzxf"] Mar 07 07:27:11 crc kubenswrapper[4738]: E0307 07:27:11.411781 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a051bc9b-524f-4efb-94c5-06f7cfe2d2cb" containerName="oc" Mar 07 07:27:11 crc kubenswrapper[4738]: I0307 07:27:11.411795 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="a051bc9b-524f-4efb-94c5-06f7cfe2d2cb" containerName="oc" Mar 07 07:27:11 crc kubenswrapper[4738]: I0307 07:27:11.411925 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="a051bc9b-524f-4efb-94c5-06f7cfe2d2cb" containerName="oc" Mar 07 07:27:11 crc kubenswrapper[4738]: I0307 07:27:11.412941 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:11 crc kubenswrapper[4738]: I0307 07:27:11.458569 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fnzxf"] Mar 07 07:27:11 crc kubenswrapper[4738]: I0307 07:27:11.556886 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0256111f-efc0-4977-a45c-5dccde34a36d-utilities\") pod \"certified-operators-fnzxf\" (UID: \"0256111f-efc0-4977-a45c-5dccde34a36d\") " pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:11 crc kubenswrapper[4738]: I0307 07:27:11.557063 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r9q9\" (UniqueName: \"kubernetes.io/projected/0256111f-efc0-4977-a45c-5dccde34a36d-kube-api-access-7r9q9\") pod \"certified-operators-fnzxf\" (UID: \"0256111f-efc0-4977-a45c-5dccde34a36d\") " pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:11 crc kubenswrapper[4738]: I0307 07:27:11.557104 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0256111f-efc0-4977-a45c-5dccde34a36d-catalog-content\") pod \"certified-operators-fnzxf\" (UID: \"0256111f-efc0-4977-a45c-5dccde34a36d\") " pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:11 crc kubenswrapper[4738]: I0307 07:27:11.658870 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7r9q9\" (UniqueName: \"kubernetes.io/projected/0256111f-efc0-4977-a45c-5dccde34a36d-kube-api-access-7r9q9\") pod \"certified-operators-fnzxf\" (UID: \"0256111f-efc0-4977-a45c-5dccde34a36d\") " pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:11 crc kubenswrapper[4738]: I0307 07:27:11.658918 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0256111f-efc0-4977-a45c-5dccde34a36d-catalog-content\") pod \"certified-operators-fnzxf\" (UID: \"0256111f-efc0-4977-a45c-5dccde34a36d\") " pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:11 crc kubenswrapper[4738]: I0307 07:27:11.658961 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0256111f-efc0-4977-a45c-5dccde34a36d-utilities\") pod \"certified-operators-fnzxf\" (UID: \"0256111f-efc0-4977-a45c-5dccde34a36d\") " pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:11 crc kubenswrapper[4738]: I0307 07:27:11.659411 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0256111f-efc0-4977-a45c-5dccde34a36d-utilities\") pod \"certified-operators-fnzxf\" (UID: \"0256111f-efc0-4977-a45c-5dccde34a36d\") " pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:11 crc kubenswrapper[4738]: I0307 07:27:11.659484 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0256111f-efc0-4977-a45c-5dccde34a36d-catalog-content\") pod \"certified-operators-fnzxf\" (UID: \"0256111f-efc0-4977-a45c-5dccde34a36d\") " pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:11 crc kubenswrapper[4738]: I0307 07:27:11.693449 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r9q9\" (UniqueName: \"kubernetes.io/projected/0256111f-efc0-4977-a45c-5dccde34a36d-kube-api-access-7r9q9\") pod \"certified-operators-fnzxf\" (UID: \"0256111f-efc0-4977-a45c-5dccde34a36d\") " pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:11 crc kubenswrapper[4738]: I0307 07:27:11.735907 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:12 crc kubenswrapper[4738]: I0307 07:27:12.178271 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-fnzxf"] Mar 07 07:27:12 crc kubenswrapper[4738]: I0307 07:27:12.496966 4738 generic.go:334] "Generic (PLEG): container finished" podID="0256111f-efc0-4977-a45c-5dccde34a36d" containerID="6bf8d8bfc8cb2b809c45c7f05d26b20c0cb10946526f7cb5aecf229a1a4236c3" exitCode=0 Mar 07 07:27:12 crc kubenswrapper[4738]: I0307 07:27:12.497343 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnzxf" event={"ID":"0256111f-efc0-4977-a45c-5dccde34a36d","Type":"ContainerDied","Data":"6bf8d8bfc8cb2b809c45c7f05d26b20c0cb10946526f7cb5aecf229a1a4236c3"} Mar 07 07:27:12 crc kubenswrapper[4738]: I0307 07:27:12.497368 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnzxf" event={"ID":"0256111f-efc0-4977-a45c-5dccde34a36d","Type":"ContainerStarted","Data":"2a07c8f73a65a99b4c49b1068f45b21804569b5a6cd0ad8c3c2b315a9edede8b"} Mar 07 07:27:12 crc kubenswrapper[4738]: I0307 07:27:12.498965 4738 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:27:13 crc kubenswrapper[4738]: I0307 07:27:13.508419 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnzxf" event={"ID":"0256111f-efc0-4977-a45c-5dccde34a36d","Type":"ContainerStarted","Data":"c0653c28d980140be021de7828460f951200bb31b51d85622a77a4a630789f5b"} Mar 07 07:27:14 crc kubenswrapper[4738]: I0307 07:27:14.519468 4738 generic.go:334] "Generic (PLEG): container finished" podID="0256111f-efc0-4977-a45c-5dccde34a36d" containerID="c0653c28d980140be021de7828460f951200bb31b51d85622a77a4a630789f5b" exitCode=0 Mar 07 07:27:14 crc kubenswrapper[4738]: I0307 07:27:14.519533 4738 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-fnzxf" event={"ID":"0256111f-efc0-4977-a45c-5dccde34a36d","Type":"ContainerDied","Data":"c0653c28d980140be021de7828460f951200bb31b51d85622a77a4a630789f5b"} Mar 07 07:27:15 crc kubenswrapper[4738]: I0307 07:27:15.542468 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnzxf" event={"ID":"0256111f-efc0-4977-a45c-5dccde34a36d","Type":"ContainerStarted","Data":"879742868fcc85f0f39b7b481d193195bc984acf0f69806bd23d1671ad443d82"} Mar 07 07:27:15 crc kubenswrapper[4738]: I0307 07:27:15.564569 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-fnzxf" podStartSLOduration=2.073675803 podStartE2EDuration="4.564512665s" podCreationTimestamp="2026-03-07 07:27:11 +0000 UTC" firstStartedPulling="2026-03-07 07:27:12.498791636 +0000 UTC m=+1650.963778957" lastFinishedPulling="2026-03-07 07:27:14.989628498 +0000 UTC m=+1653.454615819" observedRunningTime="2026-03-07 07:27:15.562880449 +0000 UTC m=+1654.027867790" watchObservedRunningTime="2026-03-07 07:27:15.564512665 +0000 UTC m=+1654.029499996" Mar 07 07:27:21 crc kubenswrapper[4738]: I0307 07:27:21.736564 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:21 crc kubenswrapper[4738]: I0307 07:27:21.737215 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:21 crc kubenswrapper[4738]: I0307 07:27:21.807286 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:22 crc kubenswrapper[4738]: I0307 07:27:22.714999 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:22 crc kubenswrapper[4738]: I0307 
07:27:22.764921 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fnzxf"] Mar 07 07:27:24 crc kubenswrapper[4738]: I0307 07:27:24.677617 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-fnzxf" podUID="0256111f-efc0-4977-a45c-5dccde34a36d" containerName="registry-server" containerID="cri-o://879742868fcc85f0f39b7b481d193195bc984acf0f69806bd23d1671ad443d82" gracePeriod=2 Mar 07 07:27:26 crc kubenswrapper[4738]: I0307 07:27:26.045786 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc"] Mar 07 07:27:26 crc kubenswrapper[4738]: I0307 07:27:26.052602 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-db-create-rv7jl"] Mar 07 07:27:26 crc kubenswrapper[4738]: I0307 07:27:26.059383 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-c2f4-account-create-update-96rbc"] Mar 07 07:27:26 crc kubenswrapper[4738]: I0307 07:27:26.066495 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-db-create-rv7jl"] Mar 07 07:27:26 crc kubenswrapper[4738]: I0307 07:27:26.399695 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56eba8ea-c92b-40b6-84bf-b37e6327176f" path="/var/lib/kubelet/pods/56eba8ea-c92b-40b6-84bf-b37e6327176f/volumes" Mar 07 07:27:26 crc kubenswrapper[4738]: I0307 07:27:26.400845 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b35e853f-e4d7-4cea-be6c-e2405e776b20" path="/var/lib/kubelet/pods/b35e853f-e4d7-4cea-be6c-e2405e776b20/volumes" Mar 07 07:27:26 crc kubenswrapper[4738]: I0307 07:27:26.696359 4738 generic.go:334] "Generic (PLEG): container finished" podID="0256111f-efc0-4977-a45c-5dccde34a36d" containerID="879742868fcc85f0f39b7b481d193195bc984acf0f69806bd23d1671ad443d82" exitCode=0 Mar 07 07:27:26 crc kubenswrapper[4738]: 
I0307 07:27:26.696517 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnzxf" event={"ID":"0256111f-efc0-4977-a45c-5dccde34a36d","Type":"ContainerDied","Data":"879742868fcc85f0f39b7b481d193195bc984acf0f69806bd23d1671ad443d82"} Mar 07 07:27:26 crc kubenswrapper[4738]: I0307 07:27:26.889350 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.033718 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r9q9\" (UniqueName: \"kubernetes.io/projected/0256111f-efc0-4977-a45c-5dccde34a36d-kube-api-access-7r9q9\") pod \"0256111f-efc0-4977-a45c-5dccde34a36d\" (UID: \"0256111f-efc0-4977-a45c-5dccde34a36d\") " Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.034128 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0256111f-efc0-4977-a45c-5dccde34a36d-utilities\") pod \"0256111f-efc0-4977-a45c-5dccde34a36d\" (UID: \"0256111f-efc0-4977-a45c-5dccde34a36d\") " Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.034236 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0256111f-efc0-4977-a45c-5dccde34a36d-catalog-content\") pod \"0256111f-efc0-4977-a45c-5dccde34a36d\" (UID: \"0256111f-efc0-4977-a45c-5dccde34a36d\") " Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.037833 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0256111f-efc0-4977-a45c-5dccde34a36d-utilities" (OuterVolumeSpecName: "utilities") pod "0256111f-efc0-4977-a45c-5dccde34a36d" (UID: "0256111f-efc0-4977-a45c-5dccde34a36d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.041838 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0256111f-efc0-4977-a45c-5dccde34a36d-kube-api-access-7r9q9" (OuterVolumeSpecName: "kube-api-access-7r9q9") pod "0256111f-efc0-4977-a45c-5dccde34a36d" (UID: "0256111f-efc0-4977-a45c-5dccde34a36d"). InnerVolumeSpecName "kube-api-access-7r9q9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.103548 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0256111f-efc0-4977-a45c-5dccde34a36d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0256111f-efc0-4977-a45c-5dccde34a36d" (UID: "0256111f-efc0-4977-a45c-5dccde34a36d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.136228 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7r9q9\" (UniqueName: \"kubernetes.io/projected/0256111f-efc0-4977-a45c-5dccde34a36d-kube-api-access-7r9q9\") on node \"crc\" DevicePath \"\"" Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.136537 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0256111f-efc0-4977-a45c-5dccde34a36d-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.136665 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0256111f-efc0-4977-a45c-5dccde34a36d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.711785 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-fnzxf" 
event={"ID":"0256111f-efc0-4977-a45c-5dccde34a36d","Type":"ContainerDied","Data":"2a07c8f73a65a99b4c49b1068f45b21804569b5a6cd0ad8c3c2b315a9edede8b"} Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.711892 4738 scope.go:117] "RemoveContainer" containerID="879742868fcc85f0f39b7b481d193195bc984acf0f69806bd23d1671ad443d82" Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.712520 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-fnzxf" Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.743702 4738 scope.go:117] "RemoveContainer" containerID="c0653c28d980140be021de7828460f951200bb31b51d85622a77a4a630789f5b" Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.776228 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-fnzxf"] Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.781555 4738 scope.go:117] "RemoveContainer" containerID="6bf8d8bfc8cb2b809c45c7f05d26b20c0cb10946526f7cb5aecf229a1a4236c3" Mar 07 07:27:27 crc kubenswrapper[4738]: I0307 07:27:27.795892 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-fnzxf"] Mar 07 07:27:28 crc kubenswrapper[4738]: I0307 07:27:28.398891 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0256111f-efc0-4977-a45c-5dccde34a36d" path="/var/lib/kubelet/pods/0256111f-efc0-4977-a45c-5dccde34a36d/volumes" Mar 07 07:27:40 crc kubenswrapper[4738]: I0307 07:27:40.003501 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" podUID="77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 07 07:27:41 crc kubenswrapper[4738]: I0307 07:27:41.044754 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-db-sync-4d4vj"] Mar 07 07:27:41 crc 
kubenswrapper[4738]: I0307 07:27:41.053022 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-db-sync-4d4vj"] Mar 07 07:27:42 crc kubenswrapper[4738]: I0307 07:27:42.399927 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee" path="/var/lib/kubelet/pods/6f13db65-a5b3-4f0d-bdc1-2d728eaa60ee/volumes" Mar 07 07:27:45 crc kubenswrapper[4738]: I0307 07:27:45.188664 4738 scope.go:117] "RemoveContainer" containerID="411a532818975637abe917bde52cbf4f704e808b4c662524cf9287ac5c272f82" Mar 07 07:27:45 crc kubenswrapper[4738]: I0307 07:27:45.245638 4738 scope.go:117] "RemoveContainer" containerID="7fa14d51a8680009a813b1fcffd1385164c6dc55cef6e173c734467b48a41482" Mar 07 07:27:45 crc kubenswrapper[4738]: I0307 07:27:45.281624 4738 scope.go:117] "RemoveContainer" containerID="7693ad0835e1960d2671c5be0e94b2c12b5852315d22f365a14b62b8b764fa9c" Mar 07 07:27:45 crc kubenswrapper[4738]: I0307 07:27:45.311289 4738 scope.go:117] "RemoveContainer" containerID="c5c192ceb06bd2715e13829524ff465021057a51efdc49629fc92642eac6464c" Mar 07 07:27:45 crc kubenswrapper[4738]: I0307 07:27:45.360566 4738 scope.go:117] "RemoveContainer" containerID="3aa24e37d0d3312d3218ee6e4cef6c0b1eb41c0b0f78abcc469c1f6012c723ea" Mar 07 07:27:47 crc kubenswrapper[4738]: I0307 07:27:47.028464 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-b44m6"] Mar 07 07:27:47 crc kubenswrapper[4738]: I0307 07:27:47.033906 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/keystone-bootstrap-b44m6"] Mar 07 07:27:48 crc kubenswrapper[4738]: I0307 07:27:48.415758 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44b010f6-1ee6-4660-a1dc-83507adb2ba6" path="/var/lib/kubelet/pods/44b010f6-1ee6-4660-a1dc-83507adb2ba6/volumes" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.431035 4738 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-qrhw7"] Mar 07 07:27:57 crc kubenswrapper[4738]: E0307 07:27:57.431965 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0256111f-efc0-4977-a45c-5dccde34a36d" containerName="registry-server" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.431983 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0256111f-efc0-4977-a45c-5dccde34a36d" containerName="registry-server" Mar 07 07:27:57 crc kubenswrapper[4738]: E0307 07:27:57.431996 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0256111f-efc0-4977-a45c-5dccde34a36d" containerName="extract-content" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.432004 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0256111f-efc0-4977-a45c-5dccde34a36d" containerName="extract-content" Mar 07 07:27:57 crc kubenswrapper[4738]: E0307 07:27:57.432016 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0256111f-efc0-4977-a45c-5dccde34a36d" containerName="extract-utilities" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.432024 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0256111f-efc0-4977-a45c-5dccde34a36d" containerName="extract-utilities" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.432209 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0256111f-efc0-4977-a45c-5dccde34a36d" containerName="registry-server" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.433258 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.446606 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qrhw7"] Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.630726 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b95e63-e47c-4bc8-8739-dd77cfef579b-catalog-content\") pod \"community-operators-qrhw7\" (UID: \"86b95e63-e47c-4bc8-8739-dd77cfef579b\") " pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.630857 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b95e63-e47c-4bc8-8739-dd77cfef579b-utilities\") pod \"community-operators-qrhw7\" (UID: \"86b95e63-e47c-4bc8-8739-dd77cfef579b\") " pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.630905 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp7g2\" (UniqueName: \"kubernetes.io/projected/86b95e63-e47c-4bc8-8739-dd77cfef579b-kube-api-access-sp7g2\") pod \"community-operators-qrhw7\" (UID: \"86b95e63-e47c-4bc8-8739-dd77cfef579b\") " pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.732241 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b95e63-e47c-4bc8-8739-dd77cfef579b-utilities\") pod \"community-operators-qrhw7\" (UID: \"86b95e63-e47c-4bc8-8739-dd77cfef579b\") " pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.732293 4738 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sp7g2\" (UniqueName: \"kubernetes.io/projected/86b95e63-e47c-4bc8-8739-dd77cfef579b-kube-api-access-sp7g2\") pod \"community-operators-qrhw7\" (UID: \"86b95e63-e47c-4bc8-8739-dd77cfef579b\") " pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.732337 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b95e63-e47c-4bc8-8739-dd77cfef579b-catalog-content\") pod \"community-operators-qrhw7\" (UID: \"86b95e63-e47c-4bc8-8739-dd77cfef579b\") " pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.732757 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b95e63-e47c-4bc8-8739-dd77cfef579b-utilities\") pod \"community-operators-qrhw7\" (UID: \"86b95e63-e47c-4bc8-8739-dd77cfef579b\") " pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.732789 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b95e63-e47c-4bc8-8739-dd77cfef579b-catalog-content\") pod \"community-operators-qrhw7\" (UID: \"86b95e63-e47c-4bc8-8739-dd77cfef579b\") " pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.751578 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp7g2\" (UniqueName: \"kubernetes.io/projected/86b95e63-e47c-4bc8-8739-dd77cfef579b-kube-api-access-sp7g2\") pod \"community-operators-qrhw7\" (UID: \"86b95e63-e47c-4bc8-8739-dd77cfef579b\") " pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:27:57 crc kubenswrapper[4738]: I0307 07:27:57.766450 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:27:58 crc kubenswrapper[4738]: I0307 07:27:58.046424 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qrhw7"] Mar 07 07:27:58 crc kubenswrapper[4738]: I0307 07:27:58.988696 4738 generic.go:334] "Generic (PLEG): container finished" podID="86b95e63-e47c-4bc8-8739-dd77cfef579b" containerID="92adeae4c1076d8a4290e3192e3b81e8b713e181170b90bd94bcade9132c6aa2" exitCode=0 Mar 07 07:27:58 crc kubenswrapper[4738]: I0307 07:27:58.988775 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrhw7" event={"ID":"86b95e63-e47c-4bc8-8739-dd77cfef579b","Type":"ContainerDied","Data":"92adeae4c1076d8a4290e3192e3b81e8b713e181170b90bd94bcade9132c6aa2"} Mar 07 07:27:58 crc kubenswrapper[4738]: I0307 07:27:58.989043 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrhw7" event={"ID":"86b95e63-e47c-4bc8-8739-dd77cfef579b","Type":"ContainerStarted","Data":"8d9b774e2dbaa3f0749221143da1369d2a1d0c4409d5c07c89e6d8a36af094c0"} Mar 07 07:28:00 crc kubenswrapper[4738]: I0307 07:28:00.001993 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrhw7" event={"ID":"86b95e63-e47c-4bc8-8739-dd77cfef579b","Type":"ContainerStarted","Data":"2173b3870eee907e49509cbed9b5285cd6f742e4c0f408c45af47c0944a2dbb2"} Mar 07 07:28:00 crc kubenswrapper[4738]: I0307 07:28:00.138241 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547808-ggx8q"] Mar 07 07:28:00 crc kubenswrapper[4738]: I0307 07:28:00.139280 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547808-ggx8q" Mar 07 07:28:00 crc kubenswrapper[4738]: I0307 07:28:00.145640 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:28:00 crc kubenswrapper[4738]: I0307 07:28:00.145892 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:28:00 crc kubenswrapper[4738]: I0307 07:28:00.146082 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:28:00 crc kubenswrapper[4738]: I0307 07:28:00.147548 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547808-ggx8q"] Mar 07 07:28:00 crc kubenswrapper[4738]: I0307 07:28:00.170984 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6sq8\" (UniqueName: \"kubernetes.io/projected/3b1cc776-2076-4e7a-9ace-ca1ca6abaa41-kube-api-access-c6sq8\") pod \"auto-csr-approver-29547808-ggx8q\" (UID: \"3b1cc776-2076-4e7a-9ace-ca1ca6abaa41\") " pod="openshift-infra/auto-csr-approver-29547808-ggx8q" Mar 07 07:28:00 crc kubenswrapper[4738]: I0307 07:28:00.272538 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6sq8\" (UniqueName: \"kubernetes.io/projected/3b1cc776-2076-4e7a-9ace-ca1ca6abaa41-kube-api-access-c6sq8\") pod \"auto-csr-approver-29547808-ggx8q\" (UID: \"3b1cc776-2076-4e7a-9ace-ca1ca6abaa41\") " pod="openshift-infra/auto-csr-approver-29547808-ggx8q" Mar 07 07:28:00 crc kubenswrapper[4738]: I0307 07:28:00.291372 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6sq8\" (UniqueName: \"kubernetes.io/projected/3b1cc776-2076-4e7a-9ace-ca1ca6abaa41-kube-api-access-c6sq8\") pod \"auto-csr-approver-29547808-ggx8q\" (UID: \"3b1cc776-2076-4e7a-9ace-ca1ca6abaa41\") " 
pod="openshift-infra/auto-csr-approver-29547808-ggx8q" Mar 07 07:28:00 crc kubenswrapper[4738]: I0307 07:28:00.457289 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547808-ggx8q" Mar 07 07:28:00 crc kubenswrapper[4738]: I0307 07:28:00.883252 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547808-ggx8q"] Mar 07 07:28:00 crc kubenswrapper[4738]: W0307 07:28:00.898561 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b1cc776_2076_4e7a_9ace_ca1ca6abaa41.slice/crio-76dc6a231fc901bed0611d9e497f1c510259e8d11906ef972f704626df1b8aa4 WatchSource:0}: Error finding container 76dc6a231fc901bed0611d9e497f1c510259e8d11906ef972f704626df1b8aa4: Status 404 returned error can't find the container with id 76dc6a231fc901bed0611d9e497f1c510259e8d11906ef972f704626df1b8aa4 Mar 07 07:28:01 crc kubenswrapper[4738]: I0307 07:28:01.014899 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547808-ggx8q" event={"ID":"3b1cc776-2076-4e7a-9ace-ca1ca6abaa41","Type":"ContainerStarted","Data":"76dc6a231fc901bed0611d9e497f1c510259e8d11906ef972f704626df1b8aa4"} Mar 07 07:28:01 crc kubenswrapper[4738]: I0307 07:28:01.017923 4738 generic.go:334] "Generic (PLEG): container finished" podID="86b95e63-e47c-4bc8-8739-dd77cfef579b" containerID="2173b3870eee907e49509cbed9b5285cd6f742e4c0f408c45af47c0944a2dbb2" exitCode=0 Mar 07 07:28:01 crc kubenswrapper[4738]: I0307 07:28:01.018015 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrhw7" event={"ID":"86b95e63-e47c-4bc8-8739-dd77cfef579b","Type":"ContainerDied","Data":"2173b3870eee907e49509cbed9b5285cd6f742e4c0f408c45af47c0944a2dbb2"} Mar 07 07:28:02 crc kubenswrapper[4738]: I0307 07:28:02.030372 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-qrhw7" event={"ID":"86b95e63-e47c-4bc8-8739-dd77cfef579b","Type":"ContainerStarted","Data":"5da2cbb017921765530f7a7c4beacd7e24c642bdd57e833e233f3caac583bae2"} Mar 07 07:28:02 crc kubenswrapper[4738]: I0307 07:28:02.053879 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qrhw7" podStartSLOduration=2.435257547 podStartE2EDuration="5.05385611s" podCreationTimestamp="2026-03-07 07:27:57 +0000 UTC" firstStartedPulling="2026-03-07 07:27:58.990963493 +0000 UTC m=+1697.455950844" lastFinishedPulling="2026-03-07 07:28:01.609562046 +0000 UTC m=+1700.074549407" observedRunningTime="2026-03-07 07:28:02.045763462 +0000 UTC m=+1700.510750783" watchObservedRunningTime="2026-03-07 07:28:02.05385611 +0000 UTC m=+1700.518843431" Mar 07 07:28:03 crc kubenswrapper[4738]: I0307 07:28:03.038127 4738 generic.go:334] "Generic (PLEG): container finished" podID="3b1cc776-2076-4e7a-9ace-ca1ca6abaa41" containerID="7918b0bfc4e17a97adc26916733e2f5bee3ee3c2589e235b72fc6e8f46c6fbcc" exitCode=0 Mar 07 07:28:03 crc kubenswrapper[4738]: I0307 07:28:03.038278 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547808-ggx8q" event={"ID":"3b1cc776-2076-4e7a-9ace-ca1ca6abaa41","Type":"ContainerDied","Data":"7918b0bfc4e17a97adc26916733e2f5bee3ee3c2589e235b72fc6e8f46c6fbcc"} Mar 07 07:28:04 crc kubenswrapper[4738]: I0307 07:28:04.468920 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547808-ggx8q" Mar 07 07:28:04 crc kubenswrapper[4738]: I0307 07:28:04.647188 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6sq8\" (UniqueName: \"kubernetes.io/projected/3b1cc776-2076-4e7a-9ace-ca1ca6abaa41-kube-api-access-c6sq8\") pod \"3b1cc776-2076-4e7a-9ace-ca1ca6abaa41\" (UID: \"3b1cc776-2076-4e7a-9ace-ca1ca6abaa41\") " Mar 07 07:28:04 crc kubenswrapper[4738]: I0307 07:28:04.655120 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b1cc776-2076-4e7a-9ace-ca1ca6abaa41-kube-api-access-c6sq8" (OuterVolumeSpecName: "kube-api-access-c6sq8") pod "3b1cc776-2076-4e7a-9ace-ca1ca6abaa41" (UID: "3b1cc776-2076-4e7a-9ace-ca1ca6abaa41"). InnerVolumeSpecName "kube-api-access-c6sq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:28:04 crc kubenswrapper[4738]: I0307 07:28:04.749272 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6sq8\" (UniqueName: \"kubernetes.io/projected/3b1cc776-2076-4e7a-9ace-ca1ca6abaa41-kube-api-access-c6sq8\") on node \"crc\" DevicePath \"\"" Mar 07 07:28:05 crc kubenswrapper[4738]: I0307 07:28:05.088004 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547808-ggx8q" event={"ID":"3b1cc776-2076-4e7a-9ace-ca1ca6abaa41","Type":"ContainerDied","Data":"76dc6a231fc901bed0611d9e497f1c510259e8d11906ef972f704626df1b8aa4"} Mar 07 07:28:05 crc kubenswrapper[4738]: I0307 07:28:05.088060 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76dc6a231fc901bed0611d9e497f1c510259e8d11906ef972f704626df1b8aa4" Mar 07 07:28:05 crc kubenswrapper[4738]: I0307 07:28:05.088143 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547808-ggx8q" Mar 07 07:28:05 crc kubenswrapper[4738]: I0307 07:28:05.545259 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547802-n9t6g"] Mar 07 07:28:05 crc kubenswrapper[4738]: I0307 07:28:05.552042 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547802-n9t6g"] Mar 07 07:28:06 crc kubenswrapper[4738]: I0307 07:28:06.398768 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8083f68a-c9a9-481e-87a2-c23abb9a44f1" path="/var/lib/kubelet/pods/8083f68a-c9a9-481e-87a2-c23abb9a44f1/volumes" Mar 07 07:28:07 crc kubenswrapper[4738]: I0307 07:28:07.767216 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:28:07 crc kubenswrapper[4738]: I0307 07:28:07.767308 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:28:07 crc kubenswrapper[4738]: I0307 07:28:07.854996 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:28:08 crc kubenswrapper[4738]: I0307 07:28:08.178534 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:28:08 crc kubenswrapper[4738]: I0307 07:28:08.244809 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qrhw7"] Mar 07 07:28:10 crc kubenswrapper[4738]: I0307 07:28:10.131396 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qrhw7" podUID="86b95e63-e47c-4bc8-8739-dd77cfef579b" containerName="registry-server" containerID="cri-o://5da2cbb017921765530f7a7c4beacd7e24c642bdd57e833e233f3caac583bae2" gracePeriod=2 Mar 07 07:28:10 
crc kubenswrapper[4738]: I0307 07:28:10.854000 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:28:10 crc kubenswrapper[4738]: I0307 07:28:10.872633 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b95e63-e47c-4bc8-8739-dd77cfef579b-utilities\") pod \"86b95e63-e47c-4bc8-8739-dd77cfef579b\" (UID: \"86b95e63-e47c-4bc8-8739-dd77cfef579b\") " Mar 07 07:28:10 crc kubenswrapper[4738]: I0307 07:28:10.872748 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sp7g2\" (UniqueName: \"kubernetes.io/projected/86b95e63-e47c-4bc8-8739-dd77cfef579b-kube-api-access-sp7g2\") pod \"86b95e63-e47c-4bc8-8739-dd77cfef579b\" (UID: \"86b95e63-e47c-4bc8-8739-dd77cfef579b\") " Mar 07 07:28:10 crc kubenswrapper[4738]: I0307 07:28:10.872789 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b95e63-e47c-4bc8-8739-dd77cfef579b-catalog-content\") pod \"86b95e63-e47c-4bc8-8739-dd77cfef579b\" (UID: \"86b95e63-e47c-4bc8-8739-dd77cfef579b\") " Mar 07 07:28:10 crc kubenswrapper[4738]: I0307 07:28:10.878985 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86b95e63-e47c-4bc8-8739-dd77cfef579b-kube-api-access-sp7g2" (OuterVolumeSpecName: "kube-api-access-sp7g2") pod "86b95e63-e47c-4bc8-8739-dd77cfef579b" (UID: "86b95e63-e47c-4bc8-8739-dd77cfef579b"). InnerVolumeSpecName "kube-api-access-sp7g2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:28:10 crc kubenswrapper[4738]: I0307 07:28:10.891531 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86b95e63-e47c-4bc8-8739-dd77cfef579b-utilities" (OuterVolumeSpecName: "utilities") pod "86b95e63-e47c-4bc8-8739-dd77cfef579b" (UID: "86b95e63-e47c-4bc8-8739-dd77cfef579b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:28:10 crc kubenswrapper[4738]: I0307 07:28:10.959043 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86b95e63-e47c-4bc8-8739-dd77cfef579b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86b95e63-e47c-4bc8-8739-dd77cfef579b" (UID: "86b95e63-e47c-4bc8-8739-dd77cfef579b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:28:10 crc kubenswrapper[4738]: I0307 07:28:10.974832 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86b95e63-e47c-4bc8-8739-dd77cfef579b-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:28:10 crc kubenswrapper[4738]: I0307 07:28:10.974877 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sp7g2\" (UniqueName: \"kubernetes.io/projected/86b95e63-e47c-4bc8-8739-dd77cfef579b-kube-api-access-sp7g2\") on node \"crc\" DevicePath \"\"" Mar 07 07:28:10 crc kubenswrapper[4738]: I0307 07:28:10.974903 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86b95e63-e47c-4bc8-8739-dd77cfef579b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:28:11 crc kubenswrapper[4738]: I0307 07:28:11.142632 4738 generic.go:334] "Generic (PLEG): container finished" podID="86b95e63-e47c-4bc8-8739-dd77cfef579b" containerID="5da2cbb017921765530f7a7c4beacd7e24c642bdd57e833e233f3caac583bae2" exitCode=0 Mar 
07 07:28:11 crc kubenswrapper[4738]: I0307 07:28:11.142691 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrhw7" event={"ID":"86b95e63-e47c-4bc8-8739-dd77cfef579b","Type":"ContainerDied","Data":"5da2cbb017921765530f7a7c4beacd7e24c642bdd57e833e233f3caac583bae2"} Mar 07 07:28:11 crc kubenswrapper[4738]: I0307 07:28:11.142747 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qrhw7" event={"ID":"86b95e63-e47c-4bc8-8739-dd77cfef579b","Type":"ContainerDied","Data":"8d9b774e2dbaa3f0749221143da1369d2a1d0c4409d5c07c89e6d8a36af094c0"} Mar 07 07:28:11 crc kubenswrapper[4738]: I0307 07:28:11.142724 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qrhw7" Mar 07 07:28:11 crc kubenswrapper[4738]: I0307 07:28:11.142770 4738 scope.go:117] "RemoveContainer" containerID="5da2cbb017921765530f7a7c4beacd7e24c642bdd57e833e233f3caac583bae2" Mar 07 07:28:11 crc kubenswrapper[4738]: I0307 07:28:11.163311 4738 scope.go:117] "RemoveContainer" containerID="2173b3870eee907e49509cbed9b5285cd6f742e4c0f408c45af47c0944a2dbb2" Mar 07 07:28:11 crc kubenswrapper[4738]: I0307 07:28:11.188789 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qrhw7"] Mar 07 07:28:11 crc kubenswrapper[4738]: I0307 07:28:11.195789 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qrhw7"] Mar 07 07:28:11 crc kubenswrapper[4738]: I0307 07:28:11.220666 4738 scope.go:117] "RemoveContainer" containerID="92adeae4c1076d8a4290e3192e3b81e8b713e181170b90bd94bcade9132c6aa2" Mar 07 07:28:11 crc kubenswrapper[4738]: I0307 07:28:11.249126 4738 scope.go:117] "RemoveContainer" containerID="5da2cbb017921765530f7a7c4beacd7e24c642bdd57e833e233f3caac583bae2" Mar 07 07:28:11 crc kubenswrapper[4738]: E0307 07:28:11.250089 4738 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5da2cbb017921765530f7a7c4beacd7e24c642bdd57e833e233f3caac583bae2\": container with ID starting with 5da2cbb017921765530f7a7c4beacd7e24c642bdd57e833e233f3caac583bae2 not found: ID does not exist" containerID="5da2cbb017921765530f7a7c4beacd7e24c642bdd57e833e233f3caac583bae2" Mar 07 07:28:11 crc kubenswrapper[4738]: I0307 07:28:11.250135 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5da2cbb017921765530f7a7c4beacd7e24c642bdd57e833e233f3caac583bae2"} err="failed to get container status \"5da2cbb017921765530f7a7c4beacd7e24c642bdd57e833e233f3caac583bae2\": rpc error: code = NotFound desc = could not find container \"5da2cbb017921765530f7a7c4beacd7e24c642bdd57e833e233f3caac583bae2\": container with ID starting with 5da2cbb017921765530f7a7c4beacd7e24c642bdd57e833e233f3caac583bae2 not found: ID does not exist" Mar 07 07:28:11 crc kubenswrapper[4738]: I0307 07:28:11.250179 4738 scope.go:117] "RemoveContainer" containerID="2173b3870eee907e49509cbed9b5285cd6f742e4c0f408c45af47c0944a2dbb2" Mar 07 07:28:11 crc kubenswrapper[4738]: E0307 07:28:11.251330 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2173b3870eee907e49509cbed9b5285cd6f742e4c0f408c45af47c0944a2dbb2\": container with ID starting with 2173b3870eee907e49509cbed9b5285cd6f742e4c0f408c45af47c0944a2dbb2 not found: ID does not exist" containerID="2173b3870eee907e49509cbed9b5285cd6f742e4c0f408c45af47c0944a2dbb2" Mar 07 07:28:11 crc kubenswrapper[4738]: I0307 07:28:11.251411 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2173b3870eee907e49509cbed9b5285cd6f742e4c0f408c45af47c0944a2dbb2"} err="failed to get container status \"2173b3870eee907e49509cbed9b5285cd6f742e4c0f408c45af47c0944a2dbb2\": rpc error: code = NotFound desc = could not find container 
\"2173b3870eee907e49509cbed9b5285cd6f742e4c0f408c45af47c0944a2dbb2\": container with ID starting with 2173b3870eee907e49509cbed9b5285cd6f742e4c0f408c45af47c0944a2dbb2 not found: ID does not exist" Mar 07 07:28:11 crc kubenswrapper[4738]: I0307 07:28:11.251546 4738 scope.go:117] "RemoveContainer" containerID="92adeae4c1076d8a4290e3192e3b81e8b713e181170b90bd94bcade9132c6aa2" Mar 07 07:28:11 crc kubenswrapper[4738]: E0307 07:28:11.251891 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92adeae4c1076d8a4290e3192e3b81e8b713e181170b90bd94bcade9132c6aa2\": container with ID starting with 92adeae4c1076d8a4290e3192e3b81e8b713e181170b90bd94bcade9132c6aa2 not found: ID does not exist" containerID="92adeae4c1076d8a4290e3192e3b81e8b713e181170b90bd94bcade9132c6aa2" Mar 07 07:28:11 crc kubenswrapper[4738]: I0307 07:28:11.251921 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92adeae4c1076d8a4290e3192e3b81e8b713e181170b90bd94bcade9132c6aa2"} err="failed to get container status \"92adeae4c1076d8a4290e3192e3b81e8b713e181170b90bd94bcade9132c6aa2\": rpc error: code = NotFound desc = could not find container \"92adeae4c1076d8a4290e3192e3b81e8b713e181170b90bd94bcade9132c6aa2\": container with ID starting with 92adeae4c1076d8a4290e3192e3b81e8b713e181170b90bd94bcade9132c6aa2 not found: ID does not exist" Mar 07 07:28:12 crc kubenswrapper[4738]: I0307 07:28:12.398892 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86b95e63-e47c-4bc8-8739-dd77cfef579b" path="/var/lib/kubelet/pods/86b95e63-e47c-4bc8-8739-dd77cfef579b/volumes" Mar 07 07:28:16 crc kubenswrapper[4738]: I0307 07:28:16.035715 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx"] Mar 07 07:28:16 crc kubenswrapper[4738]: I0307 07:28:16.043673 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["swift-kuttl-tests/barbican-db-create-58f9g"] Mar 07 07:28:16 crc kubenswrapper[4738]: I0307 07:28:16.048462 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/barbican-db-create-58f9g"] Mar 07 07:28:16 crc kubenswrapper[4738]: I0307 07:28:16.052762 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/barbican-0d0a-account-create-update-btdzx"] Mar 07 07:28:16 crc kubenswrapper[4738]: I0307 07:28:16.400352 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b5da009-da83-4df7-adfb-5bac20ddac17" path="/var/lib/kubelet/pods/7b5da009-da83-4df7-adfb-5bac20ddac17/volumes" Mar 07 07:28:16 crc kubenswrapper[4738]: I0307 07:28:16.401531 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af3e20dd-feb2-4cd0-8745-4a6662a79f86" path="/var/lib/kubelet/pods/af3e20dd-feb2-4cd0-8745-4a6662a79f86/volumes" Mar 07 07:28:26 crc kubenswrapper[4738]: I0307 07:28:26.957767 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:28:26 crc kubenswrapper[4738]: I0307 07:28:26.958288 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:28:45 crc kubenswrapper[4738]: I0307 07:28:45.492416 4738 scope.go:117] "RemoveContainer" containerID="f07d0ca50a2495e02b8e0c7621e1c001bb46286e1026f9643f6ae7d1a829a56d" Mar 07 07:28:45 crc kubenswrapper[4738]: I0307 07:28:45.532816 4738 scope.go:117] "RemoveContainer" 
containerID="58dd40f7c880c5a7414a3dbfd169f1ac50f2fcbfac8ab42ffdc5abc57d1974e5" Mar 07 07:28:45 crc kubenswrapper[4738]: I0307 07:28:45.564275 4738 scope.go:117] "RemoveContainer" containerID="a639d6ac8983692cebcf643c0a55826fda53291deb7f4a98ff8ab7c5cf11dcac" Mar 07 07:28:45 crc kubenswrapper[4738]: I0307 07:28:45.610409 4738 scope.go:117] "RemoveContainer" containerID="903ef9e3e019fceb511f26aac2b17b266248756bd97a84e411e203273d559ca2" Mar 07 07:28:56 crc kubenswrapper[4738]: I0307 07:28:56.957915 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:28:56 crc kubenswrapper[4738]: I0307 07:28:56.958663 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:29:26 crc kubenswrapper[4738]: I0307 07:29:26.958202 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:29:26 crc kubenswrapper[4738]: I0307 07:29:26.958876 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:29:26 crc kubenswrapper[4738]: I0307 07:29:26.958941 4738 
kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:29:26 crc kubenswrapper[4738]: I0307 07:29:26.959911 4738 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15"} pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:29:26 crc kubenswrapper[4738]: I0307 07:29:26.960009 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" containerID="cri-o://46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" gracePeriod=600 Mar 07 07:29:27 crc kubenswrapper[4738]: E0307 07:29:27.087183 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:29:27 crc kubenswrapper[4738]: I0307 07:29:27.792972 4738 generic.go:334] "Generic (PLEG): container finished" podID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" exitCode=0 Mar 07 07:29:27 crc kubenswrapper[4738]: I0307 07:29:27.793052 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" 
event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerDied","Data":"46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15"} Mar 07 07:29:27 crc kubenswrapper[4738]: I0307 07:29:27.793112 4738 scope.go:117] "RemoveContainer" containerID="e68d48226357ef44bc0d64b15acc6a8bc6f19bd23144aa64037a10caf100ed7a" Mar 07 07:29:27 crc kubenswrapper[4738]: I0307 07:29:27.794283 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:29:27 crc kubenswrapper[4738]: E0307 07:29:27.794727 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:29:40 crc kubenswrapper[4738]: I0307 07:29:40.385651 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:29:40 crc kubenswrapper[4738]: E0307 07:29:40.386670 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:29:51 crc kubenswrapper[4738]: I0307 07:29:51.385910 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:29:51 crc kubenswrapper[4738]: E0307 07:29:51.387917 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.155126 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547810-4h5qr"] Mar 07 07:30:00 crc kubenswrapper[4738]: E0307 07:30:00.156225 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b95e63-e47c-4bc8-8739-dd77cfef579b" containerName="extract-content" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.156249 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b95e63-e47c-4bc8-8739-dd77cfef579b" containerName="extract-content" Mar 07 07:30:00 crc kubenswrapper[4738]: E0307 07:30:00.156286 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b95e63-e47c-4bc8-8739-dd77cfef579b" containerName="extract-utilities" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.156299 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="86b95e63-e47c-4bc8-8739-dd77cfef579b" containerName="extract-utilities" Mar 07 07:30:00 crc kubenswrapper[4738]: E0307 07:30:00.156323 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b1cc776-2076-4e7a-9ace-ca1ca6abaa41" containerName="oc" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.156335 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b1cc776-2076-4e7a-9ace-ca1ca6abaa41" containerName="oc" Mar 07 07:30:00 crc kubenswrapper[4738]: E0307 07:30:00.156360 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86b95e63-e47c-4bc8-8739-dd77cfef579b" containerName="registry-server" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.156370 4738 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="86b95e63-e47c-4bc8-8739-dd77cfef579b" containerName="registry-server" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.156588 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="86b95e63-e47c-4bc8-8739-dd77cfef579b" containerName="registry-server" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.156615 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b1cc776-2076-4e7a-9ace-ca1ca6abaa41" containerName="oc" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.157429 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547810-4h5qr" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.159037 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.161458 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.161706 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.165090 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl"] Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.165974 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.167915 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.167989 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.177952 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547810-4h5qr"] Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.204283 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl"] Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.249752 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df3f1996-9df4-42bc-9ce4-1d700ba817a6-config-volume\") pod \"collect-profiles-29547810-t59gl\" (UID: \"df3f1996-9df4-42bc-9ce4-1d700ba817a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.250091 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhrfr\" (UniqueName: \"kubernetes.io/projected/d04ed723-5f2f-402a-96c8-7249338d43ec-kube-api-access-xhrfr\") pod \"auto-csr-approver-29547810-4h5qr\" (UID: \"d04ed723-5f2f-402a-96c8-7249338d43ec\") " pod="openshift-infra/auto-csr-approver-29547810-4h5qr" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.250237 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmjn4\" (UniqueName: 
\"kubernetes.io/projected/df3f1996-9df4-42bc-9ce4-1d700ba817a6-kube-api-access-jmjn4\") pod \"collect-profiles-29547810-t59gl\" (UID: \"df3f1996-9df4-42bc-9ce4-1d700ba817a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.250546 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df3f1996-9df4-42bc-9ce4-1d700ba817a6-secret-volume\") pod \"collect-profiles-29547810-t59gl\" (UID: \"df3f1996-9df4-42bc-9ce4-1d700ba817a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.352071 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df3f1996-9df4-42bc-9ce4-1d700ba817a6-secret-volume\") pod \"collect-profiles-29547810-t59gl\" (UID: \"df3f1996-9df4-42bc-9ce4-1d700ba817a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.352197 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df3f1996-9df4-42bc-9ce4-1d700ba817a6-config-volume\") pod \"collect-profiles-29547810-t59gl\" (UID: \"df3f1996-9df4-42bc-9ce4-1d700ba817a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.352231 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhrfr\" (UniqueName: \"kubernetes.io/projected/d04ed723-5f2f-402a-96c8-7249338d43ec-kube-api-access-xhrfr\") pod \"auto-csr-approver-29547810-4h5qr\" (UID: \"d04ed723-5f2f-402a-96c8-7249338d43ec\") " pod="openshift-infra/auto-csr-approver-29547810-4h5qr" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 
07:30:00.352251 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmjn4\" (UniqueName: \"kubernetes.io/projected/df3f1996-9df4-42bc-9ce4-1d700ba817a6-kube-api-access-jmjn4\") pod \"collect-profiles-29547810-t59gl\" (UID: \"df3f1996-9df4-42bc-9ce4-1d700ba817a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.353296 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df3f1996-9df4-42bc-9ce4-1d700ba817a6-config-volume\") pod \"collect-profiles-29547810-t59gl\" (UID: \"df3f1996-9df4-42bc-9ce4-1d700ba817a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.368059 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df3f1996-9df4-42bc-9ce4-1d700ba817a6-secret-volume\") pod \"collect-profiles-29547810-t59gl\" (UID: \"df3f1996-9df4-42bc-9ce4-1d700ba817a6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.368417 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhrfr\" (UniqueName: \"kubernetes.io/projected/d04ed723-5f2f-402a-96c8-7249338d43ec-kube-api-access-xhrfr\") pod \"auto-csr-approver-29547810-4h5qr\" (UID: \"d04ed723-5f2f-402a-96c8-7249338d43ec\") " pod="openshift-infra/auto-csr-approver-29547810-4h5qr" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.370363 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmjn4\" (UniqueName: \"kubernetes.io/projected/df3f1996-9df4-42bc-9ce4-1d700ba817a6-kube-api-access-jmjn4\") pod \"collect-profiles-29547810-t59gl\" (UID: \"df3f1996-9df4-42bc-9ce4-1d700ba817a6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.481014 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547810-4h5qr" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.495998 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.903509 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547810-4h5qr"] Mar 07 07:30:00 crc kubenswrapper[4738]: W0307 07:30:00.905875 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd04ed723_5f2f_402a_96c8_7249338d43ec.slice/crio-9049d925d2d3e8978e62bc450ed3e46fbfad0c55de38ac2387fae06270e80c11 WatchSource:0}: Error finding container 9049d925d2d3e8978e62bc450ed3e46fbfad0c55de38ac2387fae06270e80c11: Status 404 returned error can't find the container with id 9049d925d2d3e8978e62bc450ed3e46fbfad0c55de38ac2387fae06270e80c11 Mar 07 07:30:00 crc kubenswrapper[4738]: I0307 07:30:00.944704 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl"] Mar 07 07:30:00 crc kubenswrapper[4738]: W0307 07:30:00.954011 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf3f1996_9df4_42bc_9ce4_1d700ba817a6.slice/crio-02a4bdb4770b9be0302b0b544efaa5ed73b4d43b8f620a4e3ee588cc80e524b4 WatchSource:0}: Error finding container 02a4bdb4770b9be0302b0b544efaa5ed73b4d43b8f620a4e3ee588cc80e524b4: Status 404 returned error can't find the container with id 02a4bdb4770b9be0302b0b544efaa5ed73b4d43b8f620a4e3ee588cc80e524b4 Mar 07 07:30:01 crc kubenswrapper[4738]: I0307 07:30:01.070039 4738 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547810-4h5qr" event={"ID":"d04ed723-5f2f-402a-96c8-7249338d43ec","Type":"ContainerStarted","Data":"9049d925d2d3e8978e62bc450ed3e46fbfad0c55de38ac2387fae06270e80c11"} Mar 07 07:30:01 crc kubenswrapper[4738]: I0307 07:30:01.071553 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" event={"ID":"df3f1996-9df4-42bc-9ce4-1d700ba817a6","Type":"ContainerStarted","Data":"02a4bdb4770b9be0302b0b544efaa5ed73b4d43b8f620a4e3ee588cc80e524b4"} Mar 07 07:30:02 crc kubenswrapper[4738]: I0307 07:30:02.080116 4738 generic.go:334] "Generic (PLEG): container finished" podID="df3f1996-9df4-42bc-9ce4-1d700ba817a6" containerID="a247c6858783ef800293c70f17a6d0c3307489cf5026526e663b74509ef024f4" exitCode=0 Mar 07 07:30:02 crc kubenswrapper[4738]: I0307 07:30:02.080206 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" event={"ID":"df3f1996-9df4-42bc-9ce4-1d700ba817a6","Type":"ContainerDied","Data":"a247c6858783ef800293c70f17a6d0c3307489cf5026526e663b74509ef024f4"} Mar 07 07:30:02 crc kubenswrapper[4738]: I0307 07:30:02.406265 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:30:02 crc kubenswrapper[4738]: E0307 07:30:02.407465 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:30:03 crc kubenswrapper[4738]: I0307 07:30:03.094135 4738 generic.go:334] "Generic (PLEG): container finished" 
podID="d04ed723-5f2f-402a-96c8-7249338d43ec" containerID="58d77757efa2e302020f8962dbd35434698242e8e66abb6fd854537f3f5dc7a0" exitCode=0 Mar 07 07:30:03 crc kubenswrapper[4738]: I0307 07:30:03.094555 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547810-4h5qr" event={"ID":"d04ed723-5f2f-402a-96c8-7249338d43ec","Type":"ContainerDied","Data":"58d77757efa2e302020f8962dbd35434698242e8e66abb6fd854537f3f5dc7a0"} Mar 07 07:30:03 crc kubenswrapper[4738]: I0307 07:30:03.361970 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" Mar 07 07:30:03 crc kubenswrapper[4738]: I0307 07:30:03.498915 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df3f1996-9df4-42bc-9ce4-1d700ba817a6-secret-volume\") pod \"df3f1996-9df4-42bc-9ce4-1d700ba817a6\" (UID: \"df3f1996-9df4-42bc-9ce4-1d700ba817a6\") " Mar 07 07:30:03 crc kubenswrapper[4738]: I0307 07:30:03.499248 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmjn4\" (UniqueName: \"kubernetes.io/projected/df3f1996-9df4-42bc-9ce4-1d700ba817a6-kube-api-access-jmjn4\") pod \"df3f1996-9df4-42bc-9ce4-1d700ba817a6\" (UID: \"df3f1996-9df4-42bc-9ce4-1d700ba817a6\") " Mar 07 07:30:03 crc kubenswrapper[4738]: I0307 07:30:03.499595 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df3f1996-9df4-42bc-9ce4-1d700ba817a6-config-volume\") pod \"df3f1996-9df4-42bc-9ce4-1d700ba817a6\" (UID: \"df3f1996-9df4-42bc-9ce4-1d700ba817a6\") " Mar 07 07:30:03 crc kubenswrapper[4738]: I0307 07:30:03.500786 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df3f1996-9df4-42bc-9ce4-1d700ba817a6-config-volume" (OuterVolumeSpecName: 
"config-volume") pod "df3f1996-9df4-42bc-9ce4-1d700ba817a6" (UID: "df3f1996-9df4-42bc-9ce4-1d700ba817a6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:30:03 crc kubenswrapper[4738]: I0307 07:30:03.502611 4738 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/df3f1996-9df4-42bc-9ce4-1d700ba817a6-config-volume\") on node \"crc\" DevicePath \"\"" Mar 07 07:30:03 crc kubenswrapper[4738]: I0307 07:30:03.505454 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df3f1996-9df4-42bc-9ce4-1d700ba817a6-kube-api-access-jmjn4" (OuterVolumeSpecName: "kube-api-access-jmjn4") pod "df3f1996-9df4-42bc-9ce4-1d700ba817a6" (UID: "df3f1996-9df4-42bc-9ce4-1d700ba817a6"). InnerVolumeSpecName "kube-api-access-jmjn4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:30:03 crc kubenswrapper[4738]: I0307 07:30:03.506333 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df3f1996-9df4-42bc-9ce4-1d700ba817a6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "df3f1996-9df4-42bc-9ce4-1d700ba817a6" (UID: "df3f1996-9df4-42bc-9ce4-1d700ba817a6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:30:03 crc kubenswrapper[4738]: I0307 07:30:03.605324 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmjn4\" (UniqueName: \"kubernetes.io/projected/df3f1996-9df4-42bc-9ce4-1d700ba817a6-kube-api-access-jmjn4\") on node \"crc\" DevicePath \"\"" Mar 07 07:30:03 crc kubenswrapper[4738]: I0307 07:30:03.605421 4738 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/df3f1996-9df4-42bc-9ce4-1d700ba817a6-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 07 07:30:04 crc kubenswrapper[4738]: I0307 07:30:04.102311 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" Mar 07 07:30:04 crc kubenswrapper[4738]: I0307 07:30:04.102309 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547810-t59gl" event={"ID":"df3f1996-9df4-42bc-9ce4-1d700ba817a6","Type":"ContainerDied","Data":"02a4bdb4770b9be0302b0b544efaa5ed73b4d43b8f620a4e3ee588cc80e524b4"} Mar 07 07:30:04 crc kubenswrapper[4738]: I0307 07:30:04.102769 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02a4bdb4770b9be0302b0b544efaa5ed73b4d43b8f620a4e3ee588cc80e524b4" Mar 07 07:30:04 crc kubenswrapper[4738]: I0307 07:30:04.406669 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547810-4h5qr" Mar 07 07:30:04 crc kubenswrapper[4738]: I0307 07:30:04.519215 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhrfr\" (UniqueName: \"kubernetes.io/projected/d04ed723-5f2f-402a-96c8-7249338d43ec-kube-api-access-xhrfr\") pod \"d04ed723-5f2f-402a-96c8-7249338d43ec\" (UID: \"d04ed723-5f2f-402a-96c8-7249338d43ec\") " Mar 07 07:30:04 crc kubenswrapper[4738]: I0307 07:30:04.523278 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d04ed723-5f2f-402a-96c8-7249338d43ec-kube-api-access-xhrfr" (OuterVolumeSpecName: "kube-api-access-xhrfr") pod "d04ed723-5f2f-402a-96c8-7249338d43ec" (UID: "d04ed723-5f2f-402a-96c8-7249338d43ec"). InnerVolumeSpecName "kube-api-access-xhrfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:30:04 crc kubenswrapper[4738]: I0307 07:30:04.621015 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhrfr\" (UniqueName: \"kubernetes.io/projected/d04ed723-5f2f-402a-96c8-7249338d43ec-kube-api-access-xhrfr\") on node \"crc\" DevicePath \"\"" Mar 07 07:30:05 crc kubenswrapper[4738]: I0307 07:30:05.110134 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547810-4h5qr" event={"ID":"d04ed723-5f2f-402a-96c8-7249338d43ec","Type":"ContainerDied","Data":"9049d925d2d3e8978e62bc450ed3e46fbfad0c55de38ac2387fae06270e80c11"} Mar 07 07:30:05 crc kubenswrapper[4738]: I0307 07:30:05.110191 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547810-4h5qr" Mar 07 07:30:05 crc kubenswrapper[4738]: I0307 07:30:05.110192 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9049d925d2d3e8978e62bc450ed3e46fbfad0c55de38ac2387fae06270e80c11" Mar 07 07:30:05 crc kubenswrapper[4738]: I0307 07:30:05.466790 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547804-b4fzl"] Mar 07 07:30:05 crc kubenswrapper[4738]: I0307 07:30:05.473024 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547804-b4fzl"] Mar 07 07:30:06 crc kubenswrapper[4738]: I0307 07:30:06.394727 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1a4252b-fc2b-4a13-b058-847d209e04d3" path="/var/lib/kubelet/pods/d1a4252b-fc2b-4a13-b058-847d209e04d3/volumes" Mar 07 07:30:13 crc kubenswrapper[4738]: I0307 07:30:13.386046 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:30:13 crc kubenswrapper[4738]: E0307 07:30:13.387187 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:30:14 crc kubenswrapper[4738]: E0307 07:30:14.608463 4738 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.51:46662->38.102.83.51:38985: write tcp 38.102.83.51:46662->38.102.83.51:38985: write: connection reset by peer Mar 07 07:30:25 crc kubenswrapper[4738]: I0307 07:30:25.385934 4738 scope.go:117] "RemoveContainer" 
containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:30:25 crc kubenswrapper[4738]: E0307 07:30:25.388140 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:30:37 crc kubenswrapper[4738]: I0307 07:30:37.385490 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:30:37 crc kubenswrapper[4738]: E0307 07:30:37.386364 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:30:45 crc kubenswrapper[4738]: I0307 07:30:45.730117 4738 scope.go:117] "RemoveContainer" containerID="690b06c8febb5ae7141b4cf073703b6f3e5f75bfc471e9734ca36ab160edf53a" Mar 07 07:30:49 crc kubenswrapper[4738]: I0307 07:30:49.385968 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:30:49 crc kubenswrapper[4738]: E0307 07:30:49.387223 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:31:04 crc kubenswrapper[4738]: I0307 07:31:04.385945 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:31:04 crc kubenswrapper[4738]: E0307 07:31:04.386651 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:31:19 crc kubenswrapper[4738]: I0307 07:31:19.386033 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:31:19 crc kubenswrapper[4738]: E0307 07:31:19.386675 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.455793 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s"] Mar 07 07:31:20 crc kubenswrapper[4738]: E0307 07:31:20.456480 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04ed723-5f2f-402a-96c8-7249338d43ec" containerName="oc" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.456498 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04ed723-5f2f-402a-96c8-7249338d43ec" 
containerName="oc" Mar 07 07:31:20 crc kubenswrapper[4738]: E0307 07:31:20.456528 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df3f1996-9df4-42bc-9ce4-1d700ba817a6" containerName="collect-profiles" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.456539 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="df3f1996-9df4-42bc-9ce4-1d700ba817a6" containerName="collect-profiles" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.456710 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="df3f1996-9df4-42bc-9ce4-1d700ba817a6" containerName="collect-profiles" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.456727 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04ed723-5f2f-402a-96c8-7249338d43ec" containerName="oc" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.457336 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.459631 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.459644 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.469126 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s"] Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.564546 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-swiftconf\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 
07:31:20.564714 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-dispersionconf\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.564802 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j7hl\" (UniqueName: \"kubernetes.io/projected/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-kube-api-access-2j7hl\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.565021 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-scripts\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.565216 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-ring-data-devices\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.565294 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-etc-swift\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: 
\"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.666700 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-dispersionconf\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.666806 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j7hl\" (UniqueName: \"kubernetes.io/projected/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-kube-api-access-2j7hl\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.666841 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-scripts\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.666895 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-ring-data-devices\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.666931 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-etc-swift\") pod 
\"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.666978 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-swiftconf\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.667597 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-etc-swift\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.667792 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-scripts\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.668318 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-ring-data-devices\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.672092 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-swiftconf\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: 
\"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.672201 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-dispersionconf\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.687815 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j7hl\" (UniqueName: \"kubernetes.io/projected/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-kube-api-access-2j7hl\") pod \"swift-ring-rebalance-debug-5ms7s\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:20 crc kubenswrapper[4738]: I0307 07:31:20.779945 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.058273 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s"] Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.569758 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.589137 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.590601 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.610793 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.616002 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.636626 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.689807 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz2lg\" (UniqueName: \"kubernetes.io/projected/04bf7f33-d023-4bab-be79-70348bf80391-kube-api-access-lz2lg\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.689864 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/04bf7f33-d023-4bab-be79-70348bf80391-etc-swift\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.689913 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/04bf7f33-d023-4bab-be79-70348bf80391-lock\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.689968 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.690038 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/04bf7f33-d023-4bab-be79-70348bf80391-cache\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.792003 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz2lg\" (UniqueName: \"kubernetes.io/projected/04bf7f33-d023-4bab-be79-70348bf80391-kube-api-access-lz2lg\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.792075 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/04bf7f33-d023-4bab-be79-70348bf80391-etc-swift\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.792133 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b8c4676c-c7a7-46e8-95d0-ec6656e32fe1-etc-swift\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.792194 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/04bf7f33-d023-4bab-be79-70348bf80391-lock\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.792240 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9b92\" (UniqueName: 
\"kubernetes.io/projected/b8c4676c-c7a7-46e8-95d0-ec6656e32fe1-kube-api-access-j9b92\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.792293 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b8c4676c-c7a7-46e8-95d0-ec6656e32fe1-cache\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.792352 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.792388 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.792724 4738 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") device mount path \"/mnt/openstack/pv11\"" pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.792866 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/04bf7f33-d023-4bab-be79-70348bf80391-lock\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") " 
pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.792919 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b8c4676c-c7a7-46e8-95d0-ec6656e32fe1-lock\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.793097 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/04bf7f33-d023-4bab-be79-70348bf80391-cache\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.793571 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/04bf7f33-d023-4bab-be79-70348bf80391-cache\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.803639 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/04bf7f33-d023-4bab-be79-70348bf80391-etc-swift\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.814215 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz2lg\" (UniqueName: \"kubernetes.io/projected/04bf7f33-d023-4bab-be79-70348bf80391-kube-api-access-lz2lg\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.826389 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"swift-storage-1\" (UID: \"04bf7f33-d023-4bab-be79-70348bf80391\") " pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.894759 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b8c4676c-c7a7-46e8-95d0-ec6656e32fe1-lock\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.894920 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b8c4676c-c7a7-46e8-95d0-ec6656e32fe1-etc-swift\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.894978 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9b92\" (UniqueName: \"kubernetes.io/projected/b8c4676c-c7a7-46e8-95d0-ec6656e32fe1-kube-api-access-j9b92\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.895016 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b8c4676c-c7a7-46e8-95d0-ec6656e32fe1-cache\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.895060 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc 
kubenswrapper[4738]: I0307 07:31:21.895542 4738 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") device mount path \"/mnt/openstack/pv02\"" pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.896270 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/b8c4676c-c7a7-46e8-95d0-ec6656e32fe1-lock\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.899047 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/b8c4676c-c7a7-46e8-95d0-ec6656e32fe1-cache\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.904496 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/b8c4676c-c7a7-46e8-95d0-ec6656e32fe1-etc-swift\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.918847 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9b92\" (UniqueName: \"kubernetes.io/projected/b8c4676c-c7a7-46e8-95d0-ec6656e32fe1-kube-api-access-j9b92\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.919532 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" 
event={"ID":"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a","Type":"ContainerStarted","Data":"492007510e067b8cb79fae26b464af1b5d45fe64ec7f02f6ea32c00ad8edafe9"} Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.919586 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" event={"ID":"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a","Type":"ContainerStarted","Data":"be8eb3880d2ad341d21846570b031e667349cf75947707d8ddf2b018e2b4667c"} Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.937385 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-2\" (UID: \"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1\") " pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.945451 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-1" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.952713 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-ckq94"] Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.955189 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-2" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.963267 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" podStartSLOduration=1.963241183 podStartE2EDuration="1.963241183s" podCreationTimestamp="2026-03-07 07:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:31:21.945787332 +0000 UTC m=+1900.410774653" watchObservedRunningTime="2026-03-07 07:31:21.963241183 +0000 UTC m=+1900.428228514" Mar 07 07:31:21 crc kubenswrapper[4738]: I0307 07:31:21.977337 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-ckq94"] Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.011131 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-8czg5"] Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.011989 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.023758 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-8czg5"] Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.088352 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-rj46d"] Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.094411 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.099585 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-rj46d"] Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.103839 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-swiftconf\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.103893 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-scripts\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.103926 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-ring-data-devices\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.103971 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-dispersionconf\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.103994 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-etc-swift\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.104053 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktk2r\" (UniqueName: \"kubernetes.io/projected/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-kube-api-access-ktk2r\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.205717 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-dispersionconf\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.205763 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-etc-swift\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.205802 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/128745d7-e885-48bf-88fe-22bacab590d5-run-httpd\") pod \"swift-proxy-76c998454c-rj46d\" (UID: \"128745d7-e885-48bf-88fe-22bacab590d5\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.205820 4738 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/128745d7-e885-48bf-88fe-22bacab590d5-etc-swift\") pod \"swift-proxy-76c998454c-rj46d\" (UID: \"128745d7-e885-48bf-88fe-22bacab590d5\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.205840 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/128745d7-e885-48bf-88fe-22bacab590d5-log-httpd\") pod \"swift-proxy-76c998454c-rj46d\" (UID: \"128745d7-e885-48bf-88fe-22bacab590d5\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.205869 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktk2r\" (UniqueName: \"kubernetes.io/projected/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-kube-api-access-ktk2r\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.205889 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128745d7-e885-48bf-88fe-22bacab590d5-config-data\") pod \"swift-proxy-76c998454c-rj46d\" (UID: \"128745d7-e885-48bf-88fe-22bacab590d5\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.205911 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k57nx\" (UniqueName: \"kubernetes.io/projected/128745d7-e885-48bf-88fe-22bacab590d5-kube-api-access-k57nx\") pod \"swift-proxy-76c998454c-rj46d\" (UID: \"128745d7-e885-48bf-88fe-22bacab590d5\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 
07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.205971 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-swiftconf\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.205996 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-scripts\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.206016 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-ring-data-devices\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.206391 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-etc-swift\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.207293 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-scripts\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.208029 4738 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-ring-data-devices\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.212029 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-swiftconf\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.212329 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-dispersionconf\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.222813 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktk2r\" (UniqueName: \"kubernetes.io/projected/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-kube-api-access-ktk2r\") pod \"swift-ring-rebalance-8czg5\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.307589 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/128745d7-e885-48bf-88fe-22bacab590d5-run-httpd\") pod \"swift-proxy-76c998454c-rj46d\" (UID: \"128745d7-e885-48bf-88fe-22bacab590d5\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.308422 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/projected/128745d7-e885-48bf-88fe-22bacab590d5-etc-swift\") pod \"swift-proxy-76c998454c-rj46d\" (UID: \"128745d7-e885-48bf-88fe-22bacab590d5\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.308380 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/128745d7-e885-48bf-88fe-22bacab590d5-run-httpd\") pod \"swift-proxy-76c998454c-rj46d\" (UID: \"128745d7-e885-48bf-88fe-22bacab590d5\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.308487 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/128745d7-e885-48bf-88fe-22bacab590d5-log-httpd\") pod \"swift-proxy-76c998454c-rj46d\" (UID: \"128745d7-e885-48bf-88fe-22bacab590d5\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.308517 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128745d7-e885-48bf-88fe-22bacab590d5-config-data\") pod \"swift-proxy-76c998454c-rj46d\" (UID: \"128745d7-e885-48bf-88fe-22bacab590d5\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.308757 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/128745d7-e885-48bf-88fe-22bacab590d5-log-httpd\") pod \"swift-proxy-76c998454c-rj46d\" (UID: \"128745d7-e885-48bf-88fe-22bacab590d5\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.308807 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k57nx\" (UniqueName: 
\"kubernetes.io/projected/128745d7-e885-48bf-88fe-22bacab590d5-kube-api-access-k57nx\") pod \"swift-proxy-76c998454c-rj46d\" (UID: \"128745d7-e885-48bf-88fe-22bacab590d5\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.313396 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/128745d7-e885-48bf-88fe-22bacab590d5-etc-swift\") pod \"swift-proxy-76c998454c-rj46d\" (UID: \"128745d7-e885-48bf-88fe-22bacab590d5\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.315132 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/128745d7-e885-48bf-88fe-22bacab590d5-config-data\") pod \"swift-proxy-76c998454c-rj46d\" (UID: \"128745d7-e885-48bf-88fe-22bacab590d5\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.324372 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k57nx\" (UniqueName: \"kubernetes.io/projected/128745d7-e885-48bf-88fe-22bacab590d5-kube-api-access-k57nx\") pod \"swift-proxy-76c998454c-rj46d\" (UID: \"128745d7-e885-48bf-88fe-22bacab590d5\") " pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.337231 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.398405 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ad8b667-e750-405f-9dc9-1e675ee21e58" path="/var/lib/kubelet/pods/8ad8b667-e750-405f-9dc9-1e675ee21e58/volumes" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.430749 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.503472 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-1"] Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.554042 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-2"] Mar 07 07:31:22 crc kubenswrapper[4738]: W0307 07:31:22.582099 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8c4676c_c7a7_46e8_95d0_ec6656e32fe1.slice/crio-31bec385f5edcfbfdec9362a5d077ba633194ef09a0aab075301faa4849aa151 WatchSource:0}: Error finding container 31bec385f5edcfbfdec9362a5d077ba633194ef09a0aab075301faa4849aa151: Status 404 returned error can't find the container with id 31bec385f5edcfbfdec9362a5d077ba633194ef09a0aab075301faa4849aa151 Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.590264 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-8czg5"] Mar 07 07:31:22 crc kubenswrapper[4738]: W0307 07:31:22.593978 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8f789001_3731_4f60_8a8a_8a41cb5a8ee8.slice/crio-19f7cae3a59be4b0593b0c2bf86be4743b0bc2b3e28edfa57de911c7912e9c53 WatchSource:0}: Error finding container 19f7cae3a59be4b0593b0c2bf86be4743b0bc2b3e28edfa57de911c7912e9c53: Status 404 returned error can't find the container with id 19f7cae3a59be4b0593b0c2bf86be4743b0bc2b3e28edfa57de911c7912e9c53 Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.890084 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-proxy-76c998454c-rj46d"] Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.945859 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" 
event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"33a1ad1718aa5cc1d3d205c2ca7f4c3810c07ef598d229841e039a75b5b2864c"} Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.945897 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"ba4177c1a42de9337184de62fb1f681a3b6519df3eb400db2efecdf6bf733208"} Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.953103 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" event={"ID":"128745d7-e885-48bf-88fe-22bacab590d5","Type":"ContainerStarted","Data":"6441f8c2058ef796830a1c4ec59b0ceebb24c904240937f163c1997cc9af9158"} Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.957845 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" event={"ID":"8f789001-3731-4f60-8a8a-8a41cb5a8ee8","Type":"ContainerStarted","Data":"c95bd3e78c2b9b5d9084552e5d26c16a807195997687c91749ec028aae53d53c"} Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.957880 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" event={"ID":"8f789001-3731-4f60-8a8a-8a41cb5a8ee8","Type":"ContainerStarted","Data":"19f7cae3a59be4b0593b0c2bf86be4743b0bc2b3e28edfa57de911c7912e9c53"} Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.972684 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"42af58d656e004aa34c1d7698aff502e555215587d6cde6c08724c1eab604bcd"} Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.972719 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" 
event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"31bec385f5edcfbfdec9362a5d077ba633194ef09a0aab075301faa4849aa151"} Mar 07 07:31:22 crc kubenswrapper[4738]: I0307 07:31:22.982900 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" podStartSLOduration=1.9828847760000001 podStartE2EDuration="1.982884776s" podCreationTimestamp="2026-03-07 07:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:31:22.979287928 +0000 UTC m=+1901.444275259" watchObservedRunningTime="2026-03-07 07:31:22.982884776 +0000 UTC m=+1901.447872097" Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.024065 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" event={"ID":"128745d7-e885-48bf-88fe-22bacab590d5","Type":"ContainerStarted","Data":"8fb505ee7895ae34e9739f831a906f5f186ca9a40d00fa3312876eb868ad4059"} Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.024498 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" event={"ID":"128745d7-e885-48bf-88fe-22bacab590d5","Type":"ContainerStarted","Data":"cadb1ffa71ebebe626bcd22ea9461d4d464ff31a988a572aad35a9e35ffd8f34"} Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.024528 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.024543 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.045572 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" podStartSLOduration=2.045554376 
podStartE2EDuration="2.045554376s" podCreationTimestamp="2026-03-07 07:31:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:31:24.045141965 +0000 UTC m=+1902.510129286" watchObservedRunningTime="2026-03-07 07:31:24.045554376 +0000 UTC m=+1902.510541697" Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.056353 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"fe6df0f394e14d95962c8101ae757adc11e79fe2bbce861658f6c8f1f7803e92"} Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.056397 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"a31dfecc191047ec0349b4d88b43fb00c1f05b71433b8ab3a15a590254f88b89"} Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.056410 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"318cfca25dbfaf7cff44dbf49d09e56dad96edb52806f23785581f24ba43cc55"} Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.056418 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"c009ffb88d78322bf56e45f186bc3700436a2a71424f04cd672cd5692beeb666"} Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.056427 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"18d688c2c1f036add9a0bef4724facec7a4e023d7fbe6b0dcd25430953142705"} Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.056436 4738 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"090f3a2e7d284f853ec10abc66e0768e31973daf6ae484b8c10ff333aa2a8921"} Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.074510 4738 generic.go:334] "Generic (PLEG): container finished" podID="b5bca2d6-5b2a-41ed-ac19-27bdfb42834a" containerID="492007510e067b8cb79fae26b464af1b5d45fe64ec7f02f6ea32c00ad8edafe9" exitCode=0 Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.074608 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" event={"ID":"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a","Type":"ContainerDied","Data":"492007510e067b8cb79fae26b464af1b5d45fe64ec7f02f6ea32c00ad8edafe9"} Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.087651 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"3a5e8be1e78de000b9184720fc46b642c079f74816a915a9f0b439fcd2a775c8"} Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.087688 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"95c446bb0d4591aaaccf5355e1c7d66190160af5161e5adff27289630eb00f15"} Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.087697 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"2bc3fae0dbd804317c4e00cdd9c5e4c7b8430205d8bc82812608100addf33330"} Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.087706 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" 
event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"280e22ef12db1ec911e24fbae5dfb063b32074d9929034d16fe43c408cf9c07a"} Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.087715 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"1628744bd5be9ae0252f0a17291cd1c61934b61657d874c7dd69a8d0446caa1c"} Mar 07 07:31:24 crc kubenswrapper[4738]: I0307 07:31:24.087723 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"369687233ff9140ab4f0a23d530bc85ab5bb1a7781129cd56181368709221746"} Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.103446 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"577499bb155cefe37fe316d99d7dd54c97d4b7a820a128b43d5ef187c67b1f83"} Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.103495 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"70fa875f7f7806c9d3a826ec3613737f4b6af38a9c08ae9e8395e922d6c61063"} Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.103510 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"9658b760b7bd26bae2c5b86ef9e900440c6d7857aa66b075867821aa66918309"} Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.103522 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" 
event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"1c0882956c7f422db97c04f0bdabf53afdd504c855ffe3ab98f2d550f8be2ade"} Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.103533 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"b5b944f7783a239e0098249d3d699c5f50647db2ef841b215ef8c6767edee88c"} Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.111070 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"6c218f54b6ffbb920b5aed75bdf473150fc27fac0d923c03c5c67f3b37b76252"} Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.111124 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"547f408e8d713e4e1c01bd1244fb86f79a4c3458c1911e87888c046da2e05290"} Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.111135 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"59e785eee8ca456dba8d65059136b44e60c73ad488e2b6b3fbc4058b3fac7624"} Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.111144 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"33478c986243ce23204d226806e07b543bb378aed7844a93d9774634754a8613"} Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.111173 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" 
event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"cb08290fb841213eaf64a607ed0356ab1d53f7a0ef770717068ec3bb44fdb6d0"} Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.444060 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.480436 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s"] Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.488197 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s"] Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.560457 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-dispersionconf\") pod \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.560528 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-scripts\") pod \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.560575 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-swiftconf\") pod \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.560597 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-ring-data-devices\") pod \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.560651 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2j7hl\" (UniqueName: \"kubernetes.io/projected/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-kube-api-access-2j7hl\") pod \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.560687 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-etc-swift\") pod \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\" (UID: \"b5bca2d6-5b2a-41ed-ac19-27bdfb42834a\") " Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.561624 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b5bca2d6-5b2a-41ed-ac19-27bdfb42834a" (UID: "b5bca2d6-5b2a-41ed-ac19-27bdfb42834a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.561922 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b5bca2d6-5b2a-41ed-ac19-27bdfb42834a" (UID: "b5bca2d6-5b2a-41ed-ac19-27bdfb42834a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.582299 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-kube-api-access-2j7hl" (OuterVolumeSpecName: "kube-api-access-2j7hl") pod "b5bca2d6-5b2a-41ed-ac19-27bdfb42834a" (UID: "b5bca2d6-5b2a-41ed-ac19-27bdfb42834a"). InnerVolumeSpecName "kube-api-access-2j7hl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.582499 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-scripts" (OuterVolumeSpecName: "scripts") pod "b5bca2d6-5b2a-41ed-ac19-27bdfb42834a" (UID: "b5bca2d6-5b2a-41ed-ac19-27bdfb42834a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.584588 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b5bca2d6-5b2a-41ed-ac19-27bdfb42834a" (UID: "b5bca2d6-5b2a-41ed-ac19-27bdfb42834a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.589281 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b5bca2d6-5b2a-41ed-ac19-27bdfb42834a" (UID: "b5bca2d6-5b2a-41ed-ac19-27bdfb42834a"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.652744 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk"] Mar 07 07:31:25 crc kubenswrapper[4738]: E0307 07:31:25.653043 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5bca2d6-5b2a-41ed-ac19-27bdfb42834a" containerName="swift-ring-rebalance" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.653054 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5bca2d6-5b2a-41ed-ac19-27bdfb42834a" containerName="swift-ring-rebalance" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.653224 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5bca2d6-5b2a-41ed-ac19-27bdfb42834a" containerName="swift-ring-rebalance" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.653699 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.662867 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.663136 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2j7hl\" (UniqueName: \"kubernetes.io/projected/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-kube-api-access-2j7hl\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.663362 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.663484 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.663599 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.663731 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.664308 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk"] Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.766984 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-scripts\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.767021 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-swiftconf\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.767068 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sn6j\" (UniqueName: \"kubernetes.io/projected/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-kube-api-access-9sn6j\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: 
\"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.767111 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-dispersionconf\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.767143 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-ring-data-devices\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.767177 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-etc-swift\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.868934 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-dispersionconf\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.869019 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-ring-data-devices\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.869055 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-etc-swift\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.869121 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-scripts\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.869173 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-swiftconf\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.869252 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sn6j\" (UniqueName: \"kubernetes.io/projected/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-kube-api-access-9sn6j\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.870225 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-etc-swift\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.870738 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-scripts\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.871523 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-ring-data-devices\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.875342 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-swiftconf\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.877598 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-dispersionconf\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:25 crc kubenswrapper[4738]: I0307 07:31:25.893708 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sn6j\" (UniqueName: 
\"kubernetes.io/projected/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-kube-api-access-9sn6j\") pod \"swift-ring-rebalance-debug-b7kzk\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.026347 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.130755 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"ff81d767e84fee14da45d89e38441a53af3cbc62cfe7a61ed72995167958917d"} Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.131033 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"14416d9dbcf32cd444dd4fa2b98b6a8c2f6dfcba27e98368bdf48edd445bd37c"} Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.131045 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-2" event={"ID":"b8c4676c-c7a7-46e8-95d0-ec6656e32fe1","Type":"ContainerStarted","Data":"3384fd81ecff7059fc3aa538d375f66ecd9b19979ab3cb78e9778e65741b5854"} Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.132919 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be8eb3880d2ad341d21846570b031e667349cf75947707d8ddf2b018e2b4667c" Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.132988 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5ms7s" Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.145336 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"89fb071ffc9cd931d6f8591e62a038a9992b81f4a3ca2493ed4843dcf1ec9ed6"} Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.145371 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"3629da8a01f1e4479cbd3ede7dc94ab97d2390af0e5efe97601bb1b134afba5d"} Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.145380 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-1" event={"ID":"04bf7f33-d023-4bab-be79-70348bf80391","Type":"ContainerStarted","Data":"4aa3f133cbd8d8a241b4a09113c7820663c9331396062e3c68ca1c77f1a7d851"} Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.195246 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-2" podStartSLOduration=6.195224621 podStartE2EDuration="6.195224621s" podCreationTimestamp="2026-03-07 07:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:31:26.172374926 +0000 UTC m=+1904.637362267" watchObservedRunningTime="2026-03-07 07:31:26.195224621 +0000 UTC m=+1904.660211962" Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.218797 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-1" podStartSLOduration=6.218774475 podStartE2EDuration="6.218774475s" podCreationTimestamp="2026-03-07 07:31:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-07 07:31:26.210254086 +0000 UTC m=+1904.675241417" watchObservedRunningTime="2026-03-07 07:31:26.218774475 +0000 UTC m=+1904.683761816" Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.253364 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.253784 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-server" containerID="cri-o://913e842ec4b8fb2ceb7aecd6316998ff118c0efd11bb7125242adca66c43121f" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.254183 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-sharder" containerID="cri-o://e2afcb3d5bd6da436cd1d63b989f97736f8b78ab7c1d84e305e383973d5b5418" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.254231 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="swift-recon-cron" containerID="cri-o://1e33676e1dfc72460ee74d4f28f334f29bb4af4e0ccf64a2a72424ee26b1be7e" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.254265 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="rsync" containerID="cri-o://1a160f2572469e6627e2860f3f697e4574196f2abbf061aa6cf9df71d1e4fac2" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.254295 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" 
containerName="object-expirer" containerID="cri-o://00a1114ced83cdd25422442b4fc0dbeaa45242e22daac40c0a8c217baf840470" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.254326 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-updater" containerID="cri-o://5d071eeaee3064a50621d8de7dd7af7b823932293d510432e668f78c38c1c4e7" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.254353 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-auditor" containerID="cri-o://52cb8f78219255bab27dd27dfd8cc4e03cfb0253330ed63d3a97072e8745eb04" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.254384 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-replicator" containerID="cri-o://9bd4cf4c776e165070f9dc389014feffc9dbf95798d7b06bbaaee8c4e5512986" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.254411 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-server" containerID="cri-o://cbf1846742e98ca6b7886c4be2aef198fcd953fc98f719a09f117ff72ec44b3c" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.254444 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-updater" containerID="cri-o://2e138c25ff9cc34c446cd31eed6f66670e0905685accd7512401f944467368d4" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.254472 4738 
kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-auditor" containerID="cri-o://2656672c4b40e1b490b16dc369afe4001df59767d99c90164f2927301d09099e" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.254504 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-replicator" containerID="cri-o://92d682daa1c81453a10d48543264991f1c2ae8439fa5dbdcf45ea378dea1ca87" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.254532 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-server" containerID="cri-o://51c349bc5d55ee2d3fc56b86da54c7c34384bedf3f5fe2689bea28ad7921f033" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.254561 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-reaper" containerID="cri-o://416c7fb6624c5ef2dd15d22a6f968a5dca6cebd03950721f6596ca9685d06444" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.254588 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-auditor" containerID="cri-o://c3f19f9768659ab0515ccbf7de57cf1e3df945357ed0ef1fe9e486d2c5eb499c" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.254618 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-storage-0" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-replicator" 
containerID="cri-o://47c03f810320b9a33114fc3c3ba2781d7ab71962a5e28154a225cd2f574bff52" gracePeriod=30 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.330147 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk"] Mar 07 07:31:26 crc kubenswrapper[4738]: W0307 07:31:26.361285 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe6574d1_d5bd_4c13_b6cd_38a348fb0ef8.slice/crio-02178138eeb94e26c194973484d8ac0ea0296e3708f8f6808195f3350082eb07 WatchSource:0}: Error finding container 02178138eeb94e26c194973484d8ac0ea0296e3708f8f6808195f3350082eb07: Status 404 returned error can't find the container with id 02178138eeb94e26c194973484d8ac0ea0296e3708f8f6808195f3350082eb07 Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.396862 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5bca2d6-5b2a-41ed-ac19-27bdfb42834a" path="/var/lib/kubelet/pods/b5bca2d6-5b2a-41ed-ac19-27bdfb42834a/volumes" Mar 07 07:31:26 crc kubenswrapper[4738]: I0307 07:31:26.685150 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk"] Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.155146 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" event={"ID":"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8","Type":"ContainerStarted","Data":"4a391a9f5b4c8a602b90fe5bf24968452b6b051d227f20539740661404f0a18f"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.155199 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" event={"ID":"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8","Type":"ContainerStarted","Data":"02178138eeb94e26c194973484d8ac0ea0296e3708f8f6808195f3350082eb07"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170673 4738 generic.go:334] "Generic (PLEG): 
container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerID="e2afcb3d5bd6da436cd1d63b989f97736f8b78ab7c1d84e305e383973d5b5418" exitCode=0 Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170722 4738 generic.go:334] "Generic (PLEG): container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerID="1a160f2572469e6627e2860f3f697e4574196f2abbf061aa6cf9df71d1e4fac2" exitCode=0 Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170737 4738 generic.go:334] "Generic (PLEG): container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerID="00a1114ced83cdd25422442b4fc0dbeaa45242e22daac40c0a8c217baf840470" exitCode=0 Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170751 4738 generic.go:334] "Generic (PLEG): container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerID="5d071eeaee3064a50621d8de7dd7af7b823932293d510432e668f78c38c1c4e7" exitCode=0 Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170762 4738 generic.go:334] "Generic (PLEG): container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerID="52cb8f78219255bab27dd27dfd8cc4e03cfb0253330ed63d3a97072e8745eb04" exitCode=0 Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170770 4738 generic.go:334] "Generic (PLEG): container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerID="9bd4cf4c776e165070f9dc389014feffc9dbf95798d7b06bbaaee8c4e5512986" exitCode=0 Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170761 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"e2afcb3d5bd6da436cd1d63b989f97736f8b78ab7c1d84e305e383973d5b5418"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170820 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"1a160f2572469e6627e2860f3f697e4574196f2abbf061aa6cf9df71d1e4fac2"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170841 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"00a1114ced83cdd25422442b4fc0dbeaa45242e22daac40c0a8c217baf840470"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170861 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"5d071eeaee3064a50621d8de7dd7af7b823932293d510432e668f78c38c1c4e7"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170779 4738 generic.go:334] "Generic (PLEG): container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerID="cbf1846742e98ca6b7886c4be2aef198fcd953fc98f719a09f117ff72ec44b3c" exitCode=0 Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170894 4738 generic.go:334] "Generic (PLEG): container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerID="2e138c25ff9cc34c446cd31eed6f66670e0905685accd7512401f944467368d4" exitCode=0 Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170904 4738 generic.go:334] "Generic (PLEG): container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerID="2656672c4b40e1b490b16dc369afe4001df59767d99c90164f2927301d09099e" exitCode=0 Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170912 4738 generic.go:334] "Generic (PLEG): container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerID="92d682daa1c81453a10d48543264991f1c2ae8439fa5dbdcf45ea378dea1ca87" exitCode=0 Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170920 4738 generic.go:334] "Generic (PLEG): container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" 
containerID="51c349bc5d55ee2d3fc56b86da54c7c34384bedf3f5fe2689bea28ad7921f033" exitCode=0 Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170928 4738 generic.go:334] "Generic (PLEG): container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerID="416c7fb6624c5ef2dd15d22a6f968a5dca6cebd03950721f6596ca9685d06444" exitCode=0 Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170936 4738 generic.go:334] "Generic (PLEG): container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerID="c3f19f9768659ab0515ccbf7de57cf1e3df945357ed0ef1fe9e486d2c5eb499c" exitCode=0 Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170944 4738 generic.go:334] "Generic (PLEG): container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerID="47c03f810320b9a33114fc3c3ba2781d7ab71962a5e28154a225cd2f574bff52" exitCode=0 Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170951 4738 generic.go:334] "Generic (PLEG): container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerID="913e842ec4b8fb2ceb7aecd6316998ff118c0efd11bb7125242adca66c43121f" exitCode=0 Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170880 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"52cb8f78219255bab27dd27dfd8cc4e03cfb0253330ed63d3a97072e8745eb04"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.170991 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"9bd4cf4c776e165070f9dc389014feffc9dbf95798d7b06bbaaee8c4e5512986"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.171010 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"cbf1846742e98ca6b7886c4be2aef198fcd953fc98f719a09f117ff72ec44b3c"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.171022 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"2e138c25ff9cc34c446cd31eed6f66670e0905685accd7512401f944467368d4"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.171035 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"2656672c4b40e1b490b16dc369afe4001df59767d99c90164f2927301d09099e"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.171048 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"92d682daa1c81453a10d48543264991f1c2ae8439fa5dbdcf45ea378dea1ca87"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.171060 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"51c349bc5d55ee2d3fc56b86da54c7c34384bedf3f5fe2689bea28ad7921f033"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.171072 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"416c7fb6624c5ef2dd15d22a6f968a5dca6cebd03950721f6596ca9685d06444"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.171083 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"c3f19f9768659ab0515ccbf7de57cf1e3df945357ed0ef1fe9e486d2c5eb499c"} Mar 07 
07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.171094 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"47c03f810320b9a33114fc3c3ba2781d7ab71962a5e28154a225cd2f574bff52"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.171105 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"913e842ec4b8fb2ceb7aecd6316998ff118c0efd11bb7125242adca66c43121f"} Mar 07 07:31:27 crc kubenswrapper[4738]: I0307 07:31:27.191653 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" podStartSLOduration=2.191615368 podStartE2EDuration="2.191615368s" podCreationTimestamp="2026-03-07 07:31:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:31:27.177036876 +0000 UTC m=+1905.642024197" watchObservedRunningTime="2026-03-07 07:31:27.191615368 +0000 UTC m=+1905.656602699" Mar 07 07:31:28 crc kubenswrapper[4738]: I0307 07:31:28.179442 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" podUID="be6574d1-d5bd-4c13-b6cd-38a348fb0ef8" containerName="swift-ring-rebalance" containerID="cri-o://4a391a9f5b4c8a602b90fe5bf24968452b6b051d227f20539740661404f0a18f" gracePeriod=30 Mar 07 07:31:31 crc kubenswrapper[4738]: I0307 07:31:31.385325 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:31:31 crc kubenswrapper[4738]: E0307 07:31:31.385996 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:31:32 crc kubenswrapper[4738]: I0307 07:31:32.209830 4738 generic.go:334] "Generic (PLEG): container finished" podID="8f789001-3731-4f60-8a8a-8a41cb5a8ee8" containerID="c95bd3e78c2b9b5d9084552e5d26c16a807195997687c91749ec028aae53d53c" exitCode=0 Mar 07 07:31:32 crc kubenswrapper[4738]: I0307 07:31:32.209929 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" event={"ID":"8f789001-3731-4f60-8a8a-8a41cb5a8ee8","Type":"ContainerDied","Data":"c95bd3e78c2b9b5d9084552e5d26c16a807195997687c91749ec028aae53d53c"} Mar 07 07:31:32 crc kubenswrapper[4738]: I0307 07:31:32.436126 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:32 crc kubenswrapper[4738]: I0307 07:31:32.437214 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="swift-kuttl-tests/swift-proxy-76c998454c-rj46d" Mar 07 07:31:32 crc kubenswrapper[4738]: I0307 07:31:32.513898 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-6ff54dd47f-45755"] Mar 07 07:31:32 crc kubenswrapper[4738]: I0307 07:31:32.514431 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" podUID="77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" containerName="proxy-server" containerID="cri-o://fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9" gracePeriod=30 Mar 07 07:31:32 crc kubenswrapper[4738]: I0307 07:31:32.515604 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" podUID="77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" 
containerName="proxy-httpd" containerID="cri-o://9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635" gracePeriod=30 Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.173739 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.218757 4738 generic.go:334] "Generic (PLEG): container finished" podID="77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" containerID="fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9" exitCode=0 Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.218784 4738 generic.go:334] "Generic (PLEG): container finished" podID="77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" containerID="9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635" exitCode=0 Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.218926 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.219486 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" event={"ID":"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6","Type":"ContainerDied","Data":"fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9"} Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.219515 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" event={"ID":"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6","Type":"ContainerDied","Data":"9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635"} Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.219527 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-proxy-6ff54dd47f-45755" event={"ID":"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6","Type":"ContainerDied","Data":"0938d1bf50cf639582a3eab54a65c671da05634d9a6809d4bf096b7d2d359823"} 
Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.219542 4738 scope.go:117] "RemoveContainer" containerID="fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.248878 4738 scope.go:117] "RemoveContainer" containerID="9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.270772 4738 scope.go:117] "RemoveContainer" containerID="fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9" Mar 07 07:31:33 crc kubenswrapper[4738]: E0307 07:31:33.271094 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9\": container with ID starting with fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9 not found: ID does not exist" containerID="fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.271120 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9"} err="failed to get container status \"fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9\": rpc error: code = NotFound desc = could not find container \"fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9\": container with ID starting with fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9 not found: ID does not exist" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.271140 4738 scope.go:117] "RemoveContainer" containerID="9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635" Mar 07 07:31:33 crc kubenswrapper[4738]: E0307 07:31:33.271541 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635\": container with ID starting with 9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635 not found: ID does not exist" containerID="9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.271568 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635"} err="failed to get container status \"9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635\": rpc error: code = NotFound desc = could not find container \"9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635\": container with ID starting with 9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635 not found: ID does not exist" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.271587 4738 scope.go:117] "RemoveContainer" containerID="fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.272320 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9"} err="failed to get container status \"fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9\": rpc error: code = NotFound desc = could not find container \"fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9\": container with ID starting with fa467708bce3e5bc479e5c6b27863ee40320f52b1b25c9e61d00f0d0990862b9 not found: ID does not exist" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.272370 4738 scope.go:117] "RemoveContainer" containerID="9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.273139 4738 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635"} err="failed to get container status \"9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635\": rpc error: code = NotFound desc = could not find container \"9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635\": container with ID starting with 9a61d2fb75e703d55904a51e0bbc21a54988fb6c6ec4e611299696259010a635 not found: ID does not exist" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.277560 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qxtc2\" (UniqueName: \"kubernetes.io/projected/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-kube-api-access-qxtc2\") pod \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.277714 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-run-httpd\") pod \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.277751 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-etc-swift\") pod \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.277824 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-config-data\") pod \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.277889 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-log-httpd\") pod \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\" (UID: \"77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6\") " Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.278197 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" (UID: "77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.278335 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" (UID: "77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.282891 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" (UID: "77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.298985 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-kube-api-access-qxtc2" (OuterVolumeSpecName: "kube-api-access-qxtc2") pod "77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" (UID: "77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6"). InnerVolumeSpecName "kube-api-access-qxtc2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.349327 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-config-data" (OuterVolumeSpecName: "config-data") pod "77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" (UID: "77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.379146 4738 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.379320 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qxtc2\" (UniqueName: \"kubernetes.io/projected/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-kube-api-access-qxtc2\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.379382 4738 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.379437 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.379493 4738 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6-config-data\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.446203 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.579975 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-proxy-6ff54dd47f-45755"] Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.582608 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-etc-swift\") pod \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.582791 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-swiftconf\") pod \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.582922 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-ring-data-devices\") pod \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.582956 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-dispersionconf\") pod \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.582988 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktk2r\" (UniqueName: \"kubernetes.io/projected/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-kube-api-access-ktk2r\") pod \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\" (UID: 
\"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.583032 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-scripts\") pod \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\" (UID: \"8f789001-3731-4f60-8a8a-8a41cb5a8ee8\") " Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.584849 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8f789001-3731-4f60-8a8a-8a41cb5a8ee8" (UID: "8f789001-3731-4f60-8a8a-8a41cb5a8ee8"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.587063 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8f789001-3731-4f60-8a8a-8a41cb5a8ee8" (UID: "8f789001-3731-4f60-8a8a-8a41cb5a8ee8"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.589468 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-kube-api-access-ktk2r" (OuterVolumeSpecName: "kube-api-access-ktk2r") pod "8f789001-3731-4f60-8a8a-8a41cb5a8ee8" (UID: "8f789001-3731-4f60-8a8a-8a41cb5a8ee8"). InnerVolumeSpecName "kube-api-access-ktk2r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.590662 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-proxy-6ff54dd47f-45755"] Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.603659 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8f789001-3731-4f60-8a8a-8a41cb5a8ee8" (UID: "8f789001-3731-4f60-8a8a-8a41cb5a8ee8"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.607915 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-scripts" (OuterVolumeSpecName: "scripts") pod "8f789001-3731-4f60-8a8a-8a41cb5a8ee8" (UID: "8f789001-3731-4f60-8a8a-8a41cb5a8ee8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.608523 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8f789001-3731-4f60-8a8a-8a41cb5a8ee8" (UID: "8f789001-3731-4f60-8a8a-8a41cb5a8ee8"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.685042 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.685430 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.685572 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktk2r\" (UniqueName: \"kubernetes.io/projected/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-kube-api-access-ktk2r\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.685696 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.685825 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:33 crc kubenswrapper[4738]: I0307 07:31:33.685939 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8f789001-3731-4f60-8a8a-8a41cb5a8ee8-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:34 crc kubenswrapper[4738]: I0307 07:31:34.227689 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" event={"ID":"8f789001-3731-4f60-8a8a-8a41cb5a8ee8","Type":"ContainerDied","Data":"19f7cae3a59be4b0593b0c2bf86be4743b0bc2b3e28edfa57de911c7912e9c53"} Mar 07 07:31:34 crc kubenswrapper[4738]: 
I0307 07:31:34.227728 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19f7cae3a59be4b0593b0c2bf86be4743b0bc2b3e28edfa57de911c7912e9c53" Mar 07 07:31:34 crc kubenswrapper[4738]: I0307 07:31:34.227742 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-8czg5" Mar 07 07:31:34 crc kubenswrapper[4738]: I0307 07:31:34.395071 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" path="/var/lib/kubelet/pods/77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6/volumes" Mar 07 07:31:44 crc kubenswrapper[4738]: I0307 07:31:44.385791 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:31:44 crc kubenswrapper[4738]: E0307 07:31:44.386559 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.390478 4738 generic.go:334] "Generic (PLEG): container finished" podID="be6574d1-d5bd-4c13-b6cd-38a348fb0ef8" containerID="4a391a9f5b4c8a602b90fe5bf24968452b6b051d227f20539740661404f0a18f" exitCode=1 Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.390563 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" event={"ID":"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8","Type":"ContainerDied","Data":"4a391a9f5b4c8a602b90fe5bf24968452b6b051d227f20539740661404f0a18f"} Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.588696 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.711217 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-etc-swift\") pod \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.711322 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-scripts\") pod \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.711347 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-ring-data-devices\") pod \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.711396 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-dispersionconf\") pod \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.711444 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-swiftconf\") pod \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.711517 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sn6j\" 
(UniqueName: \"kubernetes.io/projected/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-kube-api-access-9sn6j\") pod \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\" (UID: \"be6574d1-d5bd-4c13-b6cd-38a348fb0ef8\") " Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.712041 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "be6574d1-d5bd-4c13-b6cd-38a348fb0ef8" (UID: "be6574d1-d5bd-4c13-b6cd-38a348fb0ef8"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.712809 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "be6574d1-d5bd-4c13-b6cd-38a348fb0ef8" (UID: "be6574d1-d5bd-4c13-b6cd-38a348fb0ef8"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.717235 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-kube-api-access-9sn6j" (OuterVolumeSpecName: "kube-api-access-9sn6j") pod "be6574d1-d5bd-4c13-b6cd-38a348fb0ef8" (UID: "be6574d1-d5bd-4c13-b6cd-38a348fb0ef8"). InnerVolumeSpecName "kube-api-access-9sn6j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.733222 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "be6574d1-d5bd-4c13-b6cd-38a348fb0ef8" (UID: "be6574d1-d5bd-4c13-b6cd-38a348fb0ef8"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.733674 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "be6574d1-d5bd-4c13-b6cd-38a348fb0ef8" (UID: "be6574d1-d5bd-4c13-b6cd-38a348fb0ef8"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.737077 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-scripts" (OuterVolumeSpecName: "scripts") pod "be6574d1-d5bd-4c13-b6cd-38a348fb0ef8" (UID: "be6574d1-d5bd-4c13-b6cd-38a348fb0ef8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.803313 4738 scope.go:117] "RemoveContainer" containerID="72d5bd0ef9fcd9187362469aecc1431d05095186221ab138da11de71511415d2" Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.813446 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.813476 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.813485 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.813496 4738 reconciler_common.go:293] "Volume detached for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.813508 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.813521 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sn6j\" (UniqueName: \"kubernetes.io/projected/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8-kube-api-access-9sn6j\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.967697 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk"] Mar 07 07:31:45 crc kubenswrapper[4738]: I0307 07:31:45.973614 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk"] Mar 07 07:31:46 crc kubenswrapper[4738]: I0307 07:31:46.399043 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be6574d1-d5bd-4c13-b6cd-38a348fb0ef8" path="/var/lib/kubelet/pods/be6574d1-d5bd-4c13-b6cd-38a348fb0ef8/volumes" Mar 07 07:31:46 crc kubenswrapper[4738]: I0307 07:31:46.402413 4738 scope.go:117] "RemoveContainer" containerID="4a391a9f5b4c8a602b90fe5bf24968452b6b051d227f20539740661404f0a18f" Mar 07 07:31:46 crc kubenswrapper[4738]: I0307 07:31:46.402500 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b7kzk" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.152654 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-twr9s"] Mar 07 07:31:47 crc kubenswrapper[4738]: E0307 07:31:47.153302 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" containerName="proxy-httpd" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.153326 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" containerName="proxy-httpd" Mar 07 07:31:47 crc kubenswrapper[4738]: E0307 07:31:47.153341 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" containerName="proxy-server" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.153350 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" containerName="proxy-server" Mar 07 07:31:47 crc kubenswrapper[4738]: E0307 07:31:47.153370 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f789001-3731-4f60-8a8a-8a41cb5a8ee8" containerName="swift-ring-rebalance" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.153378 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f789001-3731-4f60-8a8a-8a41cb5a8ee8" containerName="swift-ring-rebalance" Mar 07 07:31:47 crc kubenswrapper[4738]: E0307 07:31:47.153398 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be6574d1-d5bd-4c13-b6cd-38a348fb0ef8" containerName="swift-ring-rebalance" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.153404 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="be6574d1-d5bd-4c13-b6cd-38a348fb0ef8" containerName="swift-ring-rebalance" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.153552 4738 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" containerName="proxy-server" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.153571 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="be6574d1-d5bd-4c13-b6cd-38a348fb0ef8" containerName="swift-ring-rebalance" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.153583 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="77a99f0c-5b1d-489e-8ccf-5daa0a8a20a6" containerName="proxy-httpd" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.153593 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f789001-3731-4f60-8a8a-8a41cb5a8ee8" containerName="swift-ring-rebalance" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.154104 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.156855 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.160374 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-twr9s"] Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.160694 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.334979 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b0d2335c-f082-492a-951f-580e2012a0c0-ring-data-devices\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.335037 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0d2335c-f082-492a-951f-580e2012a0c0-scripts\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.335335 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b0d2335c-f082-492a-951f-580e2012a0c0-etc-swift\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.335407 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n88w2\" (UniqueName: \"kubernetes.io/projected/b0d2335c-f082-492a-951f-580e2012a0c0-kube-api-access-n88w2\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.335462 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b0d2335c-f082-492a-951f-580e2012a0c0-swiftconf\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.335493 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b0d2335c-f082-492a-951f-580e2012a0c0-dispersionconf\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 
07:31:47.436825 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b0d2335c-f082-492a-951f-580e2012a0c0-etc-swift\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.436884 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n88w2\" (UniqueName: \"kubernetes.io/projected/b0d2335c-f082-492a-951f-580e2012a0c0-kube-api-access-n88w2\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.436927 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b0d2335c-f082-492a-951f-580e2012a0c0-swiftconf\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.436948 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b0d2335c-f082-492a-951f-580e2012a0c0-dispersionconf\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.436998 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b0d2335c-f082-492a-951f-580e2012a0c0-ring-data-devices\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc 
kubenswrapper[4738]: I0307 07:31:47.437025 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0d2335c-f082-492a-951f-580e2012a0c0-scripts\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.437335 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b0d2335c-f082-492a-951f-580e2012a0c0-etc-swift\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.437971 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0d2335c-f082-492a-951f-580e2012a0c0-scripts\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.438425 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b0d2335c-f082-492a-951f-580e2012a0c0-ring-data-devices\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.443586 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b0d2335c-f082-492a-951f-580e2012a0c0-dispersionconf\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 
07:31:47.443929 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b0d2335c-f082-492a-951f-580e2012a0c0-swiftconf\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.463459 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n88w2\" (UniqueName: \"kubernetes.io/projected/b0d2335c-f082-492a-951f-580e2012a0c0-kube-api-access-n88w2\") pod \"swift-ring-rebalance-debug-twr9s\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.471382 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:31:47 crc kubenswrapper[4738]: I0307 07:31:47.743520 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-twr9s"] Mar 07 07:31:48 crc kubenswrapper[4738]: I0307 07:31:48.438273 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" event={"ID":"b0d2335c-f082-492a-951f-580e2012a0c0","Type":"ContainerStarted","Data":"804d9e3e7c04a503f67ef04c1aacfa86a98005e3b91b55935dc1a47e60327a46"} Mar 07 07:31:48 crc kubenswrapper[4738]: I0307 07:31:48.438600 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" event={"ID":"b0d2335c-f082-492a-951f-580e2012a0c0","Type":"ContainerStarted","Data":"9bbb9cd57dd07d3e3d0ebcae65cd2954b2cf71dcc3ff66fab6b5b220fb134910"} Mar 07 07:31:48 crc kubenswrapper[4738]: I0307 07:31:48.457923 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" podStartSLOduration=1.45789832 
podStartE2EDuration="1.45789832s" podCreationTimestamp="2026-03-07 07:31:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:31:48.453919353 +0000 UTC m=+1926.918906704" watchObservedRunningTime="2026-03-07 07:31:48.45789832 +0000 UTC m=+1926.922885651" Mar 07 07:31:55 crc kubenswrapper[4738]: I0307 07:31:55.385177 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:31:55 crc kubenswrapper[4738]: E0307 07:31:55.385843 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.506373 4738 generic.go:334] "Generic (PLEG): container finished" podID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerID="1e33676e1dfc72460ee74d4f28f334f29bb4af4e0ccf64a2a72424ee26b1be7e" exitCode=137 Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.506454 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"1e33676e1dfc72460ee74d4f28f334f29bb4af4e0ccf64a2a72424ee26b1be7e"} Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.630313 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.782645 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift\") pod \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.782744 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-lock\") pod \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.782786 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-cache\") pod \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.782851 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc2mc\" (UniqueName: \"kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-kube-api-access-tc2mc\") pod \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.782903 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\" (UID: \"723fe892-48fa-4502-bd9f-2c07ed1c3dc7\") " Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.783261 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-lock" (OuterVolumeSpecName: 
"lock") pod "723fe892-48fa-4502-bd9f-2c07ed1c3dc7" (UID: "723fe892-48fa-4502-bd9f-2c07ed1c3dc7"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.783436 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-cache" (OuterVolumeSpecName: "cache") pod "723fe892-48fa-4502-bd9f-2c07ed1c3dc7" (UID: "723fe892-48fa-4502-bd9f-2c07ed1c3dc7"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.790514 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "swift") pod "723fe892-48fa-4502-bd9f-2c07ed1c3dc7" (UID: "723fe892-48fa-4502-bd9f-2c07ed1c3dc7"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.790731 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-kube-api-access-tc2mc" (OuterVolumeSpecName: "kube-api-access-tc2mc") pod "723fe892-48fa-4502-bd9f-2c07ed1c3dc7" (UID: "723fe892-48fa-4502-bd9f-2c07ed1c3dc7"). InnerVolumeSpecName "kube-api-access-tc2mc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.791000 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "723fe892-48fa-4502-bd9f-2c07ed1c3dc7" (UID: "723fe892-48fa-4502-bd9f-2c07ed1c3dc7"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.885011 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.885042 4738 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-lock\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.885051 4738 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-cache\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.885060 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc2mc\" (UniqueName: \"kubernetes.io/projected/723fe892-48fa-4502-bd9f-2c07ed1c3dc7-kube-api-access-tc2mc\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.885094 4738 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.900660 4738 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 07 07:31:56 crc kubenswrapper[4738]: I0307 07:31:56.987985 4738 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.520981 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"723fe892-48fa-4502-bd9f-2c07ed1c3dc7","Type":"ContainerDied","Data":"ff6243cf5ab07b63292f7e6730ff539588914d49b60e4da27285db5f28becb16"} Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.521336 4738 scope.go:117] "RemoveContainer" containerID="e2afcb3d5bd6da436cd1d63b989f97736f8b78ab7c1d84e305e383973d5b5418" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.521533 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.556195 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.561083 4738 scope.go:117] "RemoveContainer" containerID="1e33676e1dfc72460ee74d4f28f334f29bb4af4e0ccf64a2a72424ee26b1be7e" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.564351 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.580919 4738 scope.go:117] "RemoveContainer" containerID="1a160f2572469e6627e2860f3f697e4574196f2abbf061aa6cf9df71d1e4fac2" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.605538 4738 scope.go:117] "RemoveContainer" containerID="00a1114ced83cdd25422442b4fc0dbeaa45242e22daac40c0a8c217baf840470" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.615631 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616061 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-auditor" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616083 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-auditor" Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616100 4738 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-sharder" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616112 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-sharder" Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616130 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-server" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616142 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-server" Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616187 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-replicator" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616199 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-replicator" Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616216 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-auditor" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616226 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-auditor" Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616245 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="swift-recon-cron" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616255 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="swift-recon-cron" Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616270 
4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-updater" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616280 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-updater" Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616296 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-server" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616307 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-server" Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616329 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-expirer" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616340 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-expirer" Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616374 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-replicator" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616386 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-replicator" Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616436 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-reaper" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616448 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-reaper" Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616469 4738 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-updater" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616481 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-updater" Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616500 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-server" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616510 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-server" Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616524 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="rsync" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616535 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="rsync" Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616554 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-auditor" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616566 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-auditor" Mar 07 07:31:57 crc kubenswrapper[4738]: E0307 07:31:57.616584 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-replicator" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616596 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-replicator" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616845 4738 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-server" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616868 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-sharder" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616893 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-updater" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616912 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-expirer" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616929 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-auditor" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616944 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-replicator" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616961 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="swift-recon-cron" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616979 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-replicator" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.616995 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-server" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.617008 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-updater" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 
07:31:57.617019 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-reaper" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.617034 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="container-auditor" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.617048 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-server" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.617069 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="account-auditor" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.617087 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="rsync" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.617102 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" containerName="object-replicator" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.624522 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.637462 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.647511 4738 scope.go:117] "RemoveContainer" containerID="5d071eeaee3064a50621d8de7dd7af7b823932293d510432e668f78c38c1c4e7" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.673374 4738 scope.go:117] "RemoveContainer" containerID="52cb8f78219255bab27dd27dfd8cc4e03cfb0253330ed63d3a97072e8745eb04" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.697391 4738 scope.go:117] "RemoveContainer" containerID="9bd4cf4c776e165070f9dc389014feffc9dbf95798d7b06bbaaee8c4e5512986" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.713566 4738 scope.go:117] "RemoveContainer" containerID="cbf1846742e98ca6b7886c4be2aef198fcd953fc98f719a09f117ff72ec44b3c" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.736417 4738 scope.go:117] "RemoveContainer" containerID="2e138c25ff9cc34c446cd31eed6f66670e0905685accd7512401f944467368d4" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.751586 4738 scope.go:117] "RemoveContainer" containerID="2656672c4b40e1b490b16dc369afe4001df59767d99c90164f2927301d09099e" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.767342 4738 scope.go:117] "RemoveContainer" containerID="92d682daa1c81453a10d48543264991f1c2ae8439fa5dbdcf45ea378dea1ca87" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.788185 4738 scope.go:117] "RemoveContainer" containerID="51c349bc5d55ee2d3fc56b86da54c7c34384bedf3f5fe2689bea28ad7921f033" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.804130 4738 scope.go:117] "RemoveContainer" containerID="416c7fb6624c5ef2dd15d22a6f968a5dca6cebd03950721f6596ca9685d06444" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.805684 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-7n6mz\" (UniqueName: \"kubernetes.io/projected/4cca6511-1923-4d3b-9354-5f35fd664d64-kube-api-access-7n6mz\") pod \"swift-storage-0\" (UID: \"4cca6511-1923-4d3b-9354-5f35fd664d64\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.805941 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4cca6511-1923-4d3b-9354-5f35fd664d64-etc-swift\") pod \"swift-storage-0\" (UID: \"4cca6511-1923-4d3b-9354-5f35fd664d64\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.805982 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4cca6511-1923-4d3b-9354-5f35fd664d64-lock\") pod \"swift-storage-0\" (UID: \"4cca6511-1923-4d3b-9354-5f35fd664d64\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.806016 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4cca6511-1923-4d3b-9354-5f35fd664d64-cache\") pod \"swift-storage-0\" (UID: \"4cca6511-1923-4d3b-9354-5f35fd664d64\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.806139 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"4cca6511-1923-4d3b-9354-5f35fd664d64\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.822595 4738 scope.go:117] "RemoveContainer" containerID="c3f19f9768659ab0515ccbf7de57cf1e3df945357ed0ef1fe9e486d2c5eb499c" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.840688 4738 scope.go:117] 
"RemoveContainer" containerID="47c03f810320b9a33114fc3c3ba2781d7ab71962a5e28154a225cd2f574bff52" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.855993 4738 scope.go:117] "RemoveContainer" containerID="913e842ec4b8fb2ceb7aecd6316998ff118c0efd11bb7125242adca66c43121f" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.908027 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4cca6511-1923-4d3b-9354-5f35fd664d64-etc-swift\") pod \"swift-storage-0\" (UID: \"4cca6511-1923-4d3b-9354-5f35fd664d64\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.908075 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4cca6511-1923-4d3b-9354-5f35fd664d64-lock\") pod \"swift-storage-0\" (UID: \"4cca6511-1923-4d3b-9354-5f35fd664d64\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.908107 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4cca6511-1923-4d3b-9354-5f35fd664d64-cache\") pod \"swift-storage-0\" (UID: \"4cca6511-1923-4d3b-9354-5f35fd664d64\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.908182 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"4cca6511-1923-4d3b-9354-5f35fd664d64\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.908221 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n6mz\" (UniqueName: \"kubernetes.io/projected/4cca6511-1923-4d3b-9354-5f35fd664d64-kube-api-access-7n6mz\") pod \"swift-storage-0\" (UID: 
\"4cca6511-1923-4d3b-9354-5f35fd664d64\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.908611 4738 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"4cca6511-1923-4d3b-9354-5f35fd664d64\") device mount path \"/mnt/openstack/pv07\"" pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.908959 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/4cca6511-1923-4d3b-9354-5f35fd664d64-cache\") pod \"swift-storage-0\" (UID: \"4cca6511-1923-4d3b-9354-5f35fd664d64\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.910712 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/4cca6511-1923-4d3b-9354-5f35fd664d64-lock\") pod \"swift-storage-0\" (UID: \"4cca6511-1923-4d3b-9354-5f35fd664d64\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.921098 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/4cca6511-1923-4d3b-9354-5f35fd664d64-etc-swift\") pod \"swift-storage-0\" (UID: \"4cca6511-1923-4d3b-9354-5f35fd664d64\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.944874 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n6mz\" (UniqueName: \"kubernetes.io/projected/4cca6511-1923-4d3b-9354-5f35fd664d64-kube-api-access-7n6mz\") pod \"swift-storage-0\" (UID: \"4cca6511-1923-4d3b-9354-5f35fd664d64\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:57 crc kubenswrapper[4738]: I0307 07:31:57.946914 4738 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"swift-storage-0\" (UID: \"4cca6511-1923-4d3b-9354-5f35fd664d64\") " pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:58 crc kubenswrapper[4738]: I0307 07:31:58.252221 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-storage-0" Mar 07 07:31:58 crc kubenswrapper[4738]: I0307 07:31:58.394250 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="723fe892-48fa-4502-bd9f-2c07ed1c3dc7" path="/var/lib/kubelet/pods/723fe892-48fa-4502-bd9f-2c07ed1c3dc7/volumes" Mar 07 07:31:58 crc kubenswrapper[4738]: I0307 07:31:58.686818 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-storage-0"] Mar 07 07:31:58 crc kubenswrapper[4738]: W0307 07:31:58.694878 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cca6511_1923_4d3b_9354_5f35fd664d64.slice/crio-1feb2d627ba98177550c289527e804376601db60128f298d9aad9045a952aa57 WatchSource:0}: Error finding container 1feb2d627ba98177550c289527e804376601db60128f298d9aad9045a952aa57: Status 404 returned error can't find the container with id 1feb2d627ba98177550c289527e804376601db60128f298d9aad9045a952aa57 Mar 07 07:31:59 crc kubenswrapper[4738]: I0307 07:31:59.552446 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"542ebe4c655c273e413bbdb5cbbd6a1664a2b37fc4da8f0b93146e0ecdb72be2"} Mar 07 07:31:59 crc kubenswrapper[4738]: I0307 07:31:59.552790 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"319a8b90a0cb262124cec330403526b9c937dcd2f9243ef8ce0f598eb386e584"} Mar 07 
07:31:59 crc kubenswrapper[4738]: I0307 07:31:59.552815 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"3804278fceb76ff5a19c804e11e95aaba521b56593fce1e4a50f7ca4507485a9"} Mar 07 07:31:59 crc kubenswrapper[4738]: I0307 07:31:59.552829 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"ceecc0cbc5557d286fe7ce2a1b733f24ca83c4bb870975c0041a1ee491ac70f7"} Mar 07 07:31:59 crc kubenswrapper[4738]: I0307 07:31:59.552841 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"0ebbef42bfa0cd6475effdf1dfe85a00acfc6f3b16fcac121cafff05a788fc22"} Mar 07 07:31:59 crc kubenswrapper[4738]: I0307 07:31:59.552854 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"1feb2d627ba98177550c289527e804376601db60128f298d9aad9045a952aa57"} Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.138043 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547812-bpwrs"] Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.140764 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547812-bpwrs" Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.143398 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.143807 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.144066 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.149028 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547812-bpwrs"] Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.244010 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcrpj\" (UniqueName: \"kubernetes.io/projected/efb3d826-81d0-4364-8e2d-8f6987c5b01d-kube-api-access-xcrpj\") pod \"auto-csr-approver-29547812-bpwrs\" (UID: \"efb3d826-81d0-4364-8e2d-8f6987c5b01d\") " pod="openshift-infra/auto-csr-approver-29547812-bpwrs" Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.345941 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcrpj\" (UniqueName: \"kubernetes.io/projected/efb3d826-81d0-4364-8e2d-8f6987c5b01d-kube-api-access-xcrpj\") pod \"auto-csr-approver-29547812-bpwrs\" (UID: \"efb3d826-81d0-4364-8e2d-8f6987c5b01d\") " pod="openshift-infra/auto-csr-approver-29547812-bpwrs" Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.372892 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcrpj\" (UniqueName: \"kubernetes.io/projected/efb3d826-81d0-4364-8e2d-8f6987c5b01d-kube-api-access-xcrpj\") pod \"auto-csr-approver-29547812-bpwrs\" (UID: \"efb3d826-81d0-4364-8e2d-8f6987c5b01d\") " 
pod="openshift-infra/auto-csr-approver-29547812-bpwrs" Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.482673 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547812-bpwrs" Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.572506 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"39627c7e17a55ac4a9565ada2bf8e56b32933682e140bf86435f8c187e93eaee"} Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.572666 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"3933867ec5d021bb876828506051b0d8e268e071540233f278d17e66c4ac997c"} Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.572777 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"8f2eac88114cbe061fd7d1e14d2c01b78d33451d611613c9e9eb129ec26fe9f8"} Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.572863 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"42e9e7382222dddf8f8be400ad258f352b26a75f2af52adaa50cab64d7f6cf0f"} Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.572937 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"42a695b2d992b8eb569fc1d996f2f8a2f19c5866233c3abba3224be5c841193d"} Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.572994 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"4adbb5e520ecd0f14e075043e28cb24ecaa6676cdba12022d30487d1bba9e331"} Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.573067 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"2812c6f5aa33eb25d8e922e6b442836b67aafc8b73c2d981090b03efa5fcc539"} Mar 07 07:32:00 crc kubenswrapper[4738]: I0307 07:32:00.996882 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547812-bpwrs"] Mar 07 07:32:01 crc kubenswrapper[4738]: W0307 07:32:01.000671 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefb3d826_81d0_4364_8e2d_8f6987c5b01d.slice/crio-b7600f5033908b395527bb752627d199f9a08e260119faa543f629ee4035a0a6 WatchSource:0}: Error finding container b7600f5033908b395527bb752627d199f9a08e260119faa543f629ee4035a0a6: Status 404 returned error can't find the container with id b7600f5033908b395527bb752627d199f9a08e260119faa543f629ee4035a0a6 Mar 07 07:32:01 crc kubenswrapper[4738]: I0307 07:32:01.588320 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"a72c68efb766c337dc9defd2b1ad06881a4ff06c504cadd0dcbfde847e5533ec"} Mar 07 07:32:01 crc kubenswrapper[4738]: I0307 07:32:01.589397 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"dfcc6acf85bec25d60bcd6b84b79416866cb20bfc16abcb7b7e33dbd97b5e8a3"} Mar 07 07:32:01 crc kubenswrapper[4738]: I0307 07:32:01.589468 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-storage-0" 
event={"ID":"4cca6511-1923-4d3b-9354-5f35fd664d64","Type":"ContainerStarted","Data":"954de452be4dbed020e22fbf1248554574890517caaeab753f8a36eba8cdf0d0"} Mar 07 07:32:01 crc kubenswrapper[4738]: I0307 07:32:01.589698 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547812-bpwrs" event={"ID":"efb3d826-81d0-4364-8e2d-8f6987c5b01d","Type":"ContainerStarted","Data":"b7600f5033908b395527bb752627d199f9a08e260119faa543f629ee4035a0a6"} Mar 07 07:32:01 crc kubenswrapper[4738]: I0307 07:32:01.630822 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-storage-0" podStartSLOduration=4.630804564 podStartE2EDuration="4.630804564s" podCreationTimestamp="2026-03-07 07:31:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:32:01.625956313 +0000 UTC m=+1940.090943684" watchObservedRunningTime="2026-03-07 07:32:01.630804564 +0000 UTC m=+1940.095791885" Mar 07 07:32:02 crc kubenswrapper[4738]: I0307 07:32:02.599274 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547812-bpwrs" event={"ID":"efb3d826-81d0-4364-8e2d-8f6987c5b01d","Type":"ContainerStarted","Data":"6036dccf516273422d89097117e7260946f593754499f5e83ae63896ad481b34"} Mar 07 07:32:02 crc kubenswrapper[4738]: I0307 07:32:02.616981 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547812-bpwrs" podStartSLOduration=1.3521596869999999 podStartE2EDuration="2.616956685s" podCreationTimestamp="2026-03-07 07:32:00 +0000 UTC" firstStartedPulling="2026-03-07 07:32:01.003900277 +0000 UTC m=+1939.468887588" lastFinishedPulling="2026-03-07 07:32:02.268697265 +0000 UTC m=+1940.733684586" observedRunningTime="2026-03-07 07:32:02.613331477 +0000 UTC m=+1941.078318808" watchObservedRunningTime="2026-03-07 07:32:02.616956685 +0000 UTC 
m=+1941.081944016" Mar 07 07:32:03 crc kubenswrapper[4738]: I0307 07:32:03.608421 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547812-bpwrs" event={"ID":"efb3d826-81d0-4364-8e2d-8f6987c5b01d","Type":"ContainerDied","Data":"6036dccf516273422d89097117e7260946f593754499f5e83ae63896ad481b34"} Mar 07 07:32:03 crc kubenswrapper[4738]: I0307 07:32:03.609829 4738 generic.go:334] "Generic (PLEG): container finished" podID="efb3d826-81d0-4364-8e2d-8f6987c5b01d" containerID="6036dccf516273422d89097117e7260946f593754499f5e83ae63896ad481b34" exitCode=0 Mar 07 07:32:05 crc kubenswrapper[4738]: I0307 07:32:05.040259 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547812-bpwrs" Mar 07 07:32:05 crc kubenswrapper[4738]: I0307 07:32:05.125320 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcrpj\" (UniqueName: \"kubernetes.io/projected/efb3d826-81d0-4364-8e2d-8f6987c5b01d-kube-api-access-xcrpj\") pod \"efb3d826-81d0-4364-8e2d-8f6987c5b01d\" (UID: \"efb3d826-81d0-4364-8e2d-8f6987c5b01d\") " Mar 07 07:32:05 crc kubenswrapper[4738]: I0307 07:32:05.132737 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb3d826-81d0-4364-8e2d-8f6987c5b01d-kube-api-access-xcrpj" (OuterVolumeSpecName: "kube-api-access-xcrpj") pod "efb3d826-81d0-4364-8e2d-8f6987c5b01d" (UID: "efb3d826-81d0-4364-8e2d-8f6987c5b01d"). InnerVolumeSpecName "kube-api-access-xcrpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:32:05 crc kubenswrapper[4738]: I0307 07:32:05.226887 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcrpj\" (UniqueName: \"kubernetes.io/projected/efb3d826-81d0-4364-8e2d-8f6987c5b01d-kube-api-access-xcrpj\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:05 crc kubenswrapper[4738]: I0307 07:32:05.468064 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547806-mb6pt"] Mar 07 07:32:05 crc kubenswrapper[4738]: I0307 07:32:05.475515 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547806-mb6pt"] Mar 07 07:32:05 crc kubenswrapper[4738]: I0307 07:32:05.631479 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547812-bpwrs" event={"ID":"efb3d826-81d0-4364-8e2d-8f6987c5b01d","Type":"ContainerDied","Data":"b7600f5033908b395527bb752627d199f9a08e260119faa543f629ee4035a0a6"} Mar 07 07:32:05 crc kubenswrapper[4738]: I0307 07:32:05.631845 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7600f5033908b395527bb752627d199f9a08e260119faa543f629ee4035a0a6" Mar 07 07:32:05 crc kubenswrapper[4738]: I0307 07:32:05.631580 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547812-bpwrs" Mar 07 07:32:06 crc kubenswrapper[4738]: I0307 07:32:06.396872 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a051bc9b-524f-4efb-94c5-06f7cfe2d2cb" path="/var/lib/kubelet/pods/a051bc9b-524f-4efb-94c5-06f7cfe2d2cb/volumes" Mar 07 07:32:07 crc kubenswrapper[4738]: I0307 07:32:07.649398 4738 generic.go:334] "Generic (PLEG): container finished" podID="b0d2335c-f082-492a-951f-580e2012a0c0" containerID="804d9e3e7c04a503f67ef04c1aacfa86a98005e3b91b55935dc1a47e60327a46" exitCode=0 Mar 07 07:32:07 crc kubenswrapper[4738]: I0307 07:32:07.649454 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" event={"ID":"b0d2335c-f082-492a-951f-580e2012a0c0","Type":"ContainerDied","Data":"804d9e3e7c04a503f67ef04c1aacfa86a98005e3b91b55935dc1a47e60327a46"} Mar 07 07:32:08 crc kubenswrapper[4738]: I0307 07:32:08.386262 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:32:08 crc kubenswrapper[4738]: E0307 07:32:08.386945 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:32:08 crc kubenswrapper[4738]: I0307 07:32:08.937272 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:32:08 crc kubenswrapper[4738]: I0307 07:32:08.966737 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-twr9s"] Mar 07 07:32:08 crc kubenswrapper[4738]: I0307 07:32:08.977755 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-twr9s"] Mar 07 07:32:08 crc kubenswrapper[4738]: I0307 07:32:08.986530 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b0d2335c-f082-492a-951f-580e2012a0c0-ring-data-devices\") pod \"b0d2335c-f082-492a-951f-580e2012a0c0\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " Mar 07 07:32:08 crc kubenswrapper[4738]: I0307 07:32:08.986599 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b0d2335c-f082-492a-951f-580e2012a0c0-swiftconf\") pod \"b0d2335c-f082-492a-951f-580e2012a0c0\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " Mar 07 07:32:08 crc kubenswrapper[4738]: I0307 07:32:08.986643 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n88w2\" (UniqueName: \"kubernetes.io/projected/b0d2335c-f082-492a-951f-580e2012a0c0-kube-api-access-n88w2\") pod \"b0d2335c-f082-492a-951f-580e2012a0c0\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " Mar 07 07:32:08 crc kubenswrapper[4738]: I0307 07:32:08.986716 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b0d2335c-f082-492a-951f-580e2012a0c0-dispersionconf\") pod \"b0d2335c-f082-492a-951f-580e2012a0c0\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " Mar 07 07:32:08 crc kubenswrapper[4738]: I0307 07:32:08.986739 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/b0d2335c-f082-492a-951f-580e2012a0c0-scripts\") pod \"b0d2335c-f082-492a-951f-580e2012a0c0\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " Mar 07 07:32:08 crc kubenswrapper[4738]: I0307 07:32:08.986763 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b0d2335c-f082-492a-951f-580e2012a0c0-etc-swift\") pod \"b0d2335c-f082-492a-951f-580e2012a0c0\" (UID: \"b0d2335c-f082-492a-951f-580e2012a0c0\") " Mar 07 07:32:08 crc kubenswrapper[4738]: I0307 07:32:08.986974 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0d2335c-f082-492a-951f-580e2012a0c0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b0d2335c-f082-492a-951f-580e2012a0c0" (UID: "b0d2335c-f082-492a-951f-580e2012a0c0"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:08 crc kubenswrapper[4738]: I0307 07:32:08.987272 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b0d2335c-f082-492a-951f-580e2012a0c0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:08 crc kubenswrapper[4738]: I0307 07:32:08.987981 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0d2335c-f082-492a-951f-580e2012a0c0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b0d2335c-f082-492a-951f-580e2012a0c0" (UID: "b0d2335c-f082-492a-951f-580e2012a0c0"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:32:08 crc kubenswrapper[4738]: I0307 07:32:08.998289 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0d2335c-f082-492a-951f-580e2012a0c0-kube-api-access-n88w2" (OuterVolumeSpecName: "kube-api-access-n88w2") pod "b0d2335c-f082-492a-951f-580e2012a0c0" (UID: "b0d2335c-f082-492a-951f-580e2012a0c0"). InnerVolumeSpecName "kube-api-access-n88w2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.012291 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d2335c-f082-492a-951f-580e2012a0c0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b0d2335c-f082-492a-951f-580e2012a0c0" (UID: "b0d2335c-f082-492a-951f-580e2012a0c0"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.014420 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0d2335c-f082-492a-951f-580e2012a0c0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b0d2335c-f082-492a-951f-580e2012a0c0" (UID: "b0d2335c-f082-492a-951f-580e2012a0c0"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.014784 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0d2335c-f082-492a-951f-580e2012a0c0-scripts" (OuterVolumeSpecName: "scripts") pod "b0d2335c-f082-492a-951f-580e2012a0c0" (UID: "b0d2335c-f082-492a-951f-580e2012a0c0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.088714 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b0d2335c-f082-492a-951f-580e2012a0c0-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.088751 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n88w2\" (UniqueName: \"kubernetes.io/projected/b0d2335c-f082-492a-951f-580e2012a0c0-kube-api-access-n88w2\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.088762 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b0d2335c-f082-492a-951f-580e2012a0c0-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.088771 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b0d2335c-f082-492a-951f-580e2012a0c0-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.088781 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b0d2335c-f082-492a-951f-580e2012a0c0-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.378870 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-btbzw"] Mar 07 07:32:09 crc kubenswrapper[4738]: E0307 07:32:09.379515 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0d2335c-f082-492a-951f-580e2012a0c0" containerName="swift-ring-rebalance" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.379537 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0d2335c-f082-492a-951f-580e2012a0c0" containerName="swift-ring-rebalance" Mar 07 07:32:09 crc kubenswrapper[4738]: E0307 
07:32:09.379563 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb3d826-81d0-4364-8e2d-8f6987c5b01d" containerName="oc" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.379569 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb3d826-81d0-4364-8e2d-8f6987c5b01d" containerName="oc" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.379712 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0d2335c-f082-492a-951f-580e2012a0c0" containerName="swift-ring-rebalance" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.379724 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb3d826-81d0-4364-8e2d-8f6987c5b01d" containerName="oc" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.380189 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.403414 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-btbzw"] Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.493582 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/841e934d-6446-4776-9d10-4ce3848d6d4a-etc-swift\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.493658 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/841e934d-6446-4776-9d10-4ce3848d6d4a-scripts\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.493817 4738 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/841e934d-6446-4776-9d10-4ce3848d6d4a-ring-data-devices\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.494055 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2j66\" (UniqueName: \"kubernetes.io/projected/841e934d-6446-4776-9d10-4ce3848d6d4a-kube-api-access-l2j66\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.494104 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/841e934d-6446-4776-9d10-4ce3848d6d4a-dispersionconf\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.494195 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/841e934d-6446-4776-9d10-4ce3848d6d4a-swiftconf\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.596240 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/841e934d-6446-4776-9d10-4ce3848d6d4a-dispersionconf\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.596324 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/841e934d-6446-4776-9d10-4ce3848d6d4a-swiftconf\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.596394 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/841e934d-6446-4776-9d10-4ce3848d6d4a-etc-swift\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.596469 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/841e934d-6446-4776-9d10-4ce3848d6d4a-scripts\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.596537 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/841e934d-6446-4776-9d10-4ce3848d6d4a-ring-data-devices\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.596626 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2j66\" (UniqueName: \"kubernetes.io/projected/841e934d-6446-4776-9d10-4ce3848d6d4a-kube-api-access-l2j66\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: 
\"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.597292 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/841e934d-6446-4776-9d10-4ce3848d6d4a-scripts\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.597484 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/841e934d-6446-4776-9d10-4ce3848d6d4a-etc-swift\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.597842 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/841e934d-6446-4776-9d10-4ce3848d6d4a-ring-data-devices\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.601315 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/841e934d-6446-4776-9d10-4ce3848d6d4a-dispersionconf\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.601817 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/841e934d-6446-4776-9d10-4ce3848d6d4a-swiftconf\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.629003 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2j66\" (UniqueName: \"kubernetes.io/projected/841e934d-6446-4776-9d10-4ce3848d6d4a-kube-api-access-l2j66\") pod \"swift-ring-rebalance-debug-btbzw\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.663396 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9bbb9cd57dd07d3e3d0ebcae65cd2954b2cf71dcc3ff66fab6b5b220fb134910" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.663463 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-twr9s" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.702212 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:09 crc kubenswrapper[4738]: I0307 07:32:09.928240 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-btbzw"] Mar 07 07:32:09 crc kubenswrapper[4738]: W0307 07:32:09.932244 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod841e934d_6446_4776_9d10_4ce3848d6d4a.slice/crio-390b1f2f98efa8fdacb10f40c979535b8141860b0a4a1a549b8d69508478eb24 WatchSource:0}: Error finding container 390b1f2f98efa8fdacb10f40c979535b8141860b0a4a1a549b8d69508478eb24: Status 404 returned error can't find the container with id 390b1f2f98efa8fdacb10f40c979535b8141860b0a4a1a549b8d69508478eb24 Mar 07 07:32:10 crc kubenswrapper[4738]: I0307 07:32:10.399585 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0d2335c-f082-492a-951f-580e2012a0c0" path="/var/lib/kubelet/pods/b0d2335c-f082-492a-951f-580e2012a0c0/volumes" Mar 07 07:32:10 crc kubenswrapper[4738]: I0307 07:32:10.675412 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" event={"ID":"841e934d-6446-4776-9d10-4ce3848d6d4a","Type":"ContainerStarted","Data":"4a2d958aa533a5e510557c8fefeefec4d81ad2c944e19cb36d07a5102435935b"} Mar 07 07:32:10 crc kubenswrapper[4738]: I0307 07:32:10.676074 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" event={"ID":"841e934d-6446-4776-9d10-4ce3848d6d4a","Type":"ContainerStarted","Data":"390b1f2f98efa8fdacb10f40c979535b8141860b0a4a1a549b8d69508478eb24"} Mar 07 07:32:10 crc kubenswrapper[4738]: I0307 07:32:10.699944 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" podStartSLOduration=1.6999243800000001 podStartE2EDuration="1.69992438s" 
podCreationTimestamp="2026-03-07 07:32:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:32:10.698734618 +0000 UTC m=+1949.163721959" watchObservedRunningTime="2026-03-07 07:32:10.69992438 +0000 UTC m=+1949.164911701" Mar 07 07:32:11 crc kubenswrapper[4738]: I0307 07:32:11.684141 4738 generic.go:334] "Generic (PLEG): container finished" podID="841e934d-6446-4776-9d10-4ce3848d6d4a" containerID="4a2d958aa533a5e510557c8fefeefec4d81ad2c944e19cb36d07a5102435935b" exitCode=0 Mar 07 07:32:11 crc kubenswrapper[4738]: I0307 07:32:11.684209 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" event={"ID":"841e934d-6446-4776-9d10-4ce3848d6d4a","Type":"ContainerDied","Data":"4a2d958aa533a5e510557c8fefeefec4d81ad2c944e19cb36d07a5102435935b"} Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.068008 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.141367 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-btbzw"] Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.146548 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-btbzw"] Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.172693 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/841e934d-6446-4776-9d10-4ce3848d6d4a-ring-data-devices\") pod \"841e934d-6446-4776-9d10-4ce3848d6d4a\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.172739 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/841e934d-6446-4776-9d10-4ce3848d6d4a-scripts\") pod \"841e934d-6446-4776-9d10-4ce3848d6d4a\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.172920 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/841e934d-6446-4776-9d10-4ce3848d6d4a-swiftconf\") pod \"841e934d-6446-4776-9d10-4ce3848d6d4a\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.172975 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2j66\" (UniqueName: \"kubernetes.io/projected/841e934d-6446-4776-9d10-4ce3848d6d4a-kube-api-access-l2j66\") pod \"841e934d-6446-4776-9d10-4ce3848d6d4a\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.173008 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/841e934d-6446-4776-9d10-4ce3848d6d4a-dispersionconf\") pod \"841e934d-6446-4776-9d10-4ce3848d6d4a\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.173057 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/841e934d-6446-4776-9d10-4ce3848d6d4a-etc-swift\") pod \"841e934d-6446-4776-9d10-4ce3848d6d4a\" (UID: \"841e934d-6446-4776-9d10-4ce3848d6d4a\") " Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.173391 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841e934d-6446-4776-9d10-4ce3848d6d4a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "841e934d-6446-4776-9d10-4ce3848d6d4a" (UID: "841e934d-6446-4776-9d10-4ce3848d6d4a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.179071 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/841e934d-6446-4776-9d10-4ce3848d6d4a-kube-api-access-l2j66" (OuterVolumeSpecName: "kube-api-access-l2j66") pod "841e934d-6446-4776-9d10-4ce3848d6d4a" (UID: "841e934d-6446-4776-9d10-4ce3848d6d4a"). InnerVolumeSpecName "kube-api-access-l2j66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.181640 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/841e934d-6446-4776-9d10-4ce3848d6d4a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "841e934d-6446-4776-9d10-4ce3848d6d4a" (UID: "841e934d-6446-4776-9d10-4ce3848d6d4a"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.194481 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/841e934d-6446-4776-9d10-4ce3848d6d4a-scripts" (OuterVolumeSpecName: "scripts") pod "841e934d-6446-4776-9d10-4ce3848d6d4a" (UID: "841e934d-6446-4776-9d10-4ce3848d6d4a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.198065 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/841e934d-6446-4776-9d10-4ce3848d6d4a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "841e934d-6446-4776-9d10-4ce3848d6d4a" (UID: "841e934d-6446-4776-9d10-4ce3848d6d4a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.212254 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/841e934d-6446-4776-9d10-4ce3848d6d4a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "841e934d-6446-4776-9d10-4ce3848d6d4a" (UID: "841e934d-6446-4776-9d10-4ce3848d6d4a"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.274906 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2j66\" (UniqueName: \"kubernetes.io/projected/841e934d-6446-4776-9d10-4ce3848d6d4a-kube-api-access-l2j66\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.274945 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/841e934d-6446-4776-9d10-4ce3848d6d4a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.274953 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/841e934d-6446-4776-9d10-4ce3848d6d4a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.274964 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/841e934d-6446-4776-9d10-4ce3848d6d4a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.274973 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/841e934d-6446-4776-9d10-4ce3848d6d4a-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.274981 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/841e934d-6446-4776-9d10-4ce3848d6d4a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.702460 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="390b1f2f98efa8fdacb10f40c979535b8141860b0a4a1a549b8d69508478eb24" Mar 07 07:32:13 crc kubenswrapper[4738]: I0307 07:32:13.702542 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-btbzw" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.306698 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t779h"] Mar 07 07:32:14 crc kubenswrapper[4738]: E0307 07:32:14.307483 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="841e934d-6446-4776-9d10-4ce3848d6d4a" containerName="swift-ring-rebalance" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.307507 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="841e934d-6446-4776-9d10-4ce3848d6d4a" containerName="swift-ring-rebalance" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.307903 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="841e934d-6446-4776-9d10-4ce3848d6d4a" containerName="swift-ring-rebalance" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.308747 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.313566 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.313984 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.332226 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t779h"] Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.389941 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94a0271a-15d8-4513-89db-107877bee125-dispersionconf\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.390026 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94a0271a-15d8-4513-89db-107877bee125-swiftconf\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.390097 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94a0271a-15d8-4513-89db-107877bee125-scripts\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.390122 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94a0271a-15d8-4513-89db-107877bee125-etc-swift\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.390149 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94a0271a-15d8-4513-89db-107877bee125-ring-data-devices\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.390192 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nzxs\" (UniqueName: 
\"kubernetes.io/projected/94a0271a-15d8-4513-89db-107877bee125-kube-api-access-8nzxs\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.394860 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="841e934d-6446-4776-9d10-4ce3848d6d4a" path="/var/lib/kubelet/pods/841e934d-6446-4776-9d10-4ce3848d6d4a/volumes" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.491496 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94a0271a-15d8-4513-89db-107877bee125-scripts\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.491544 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94a0271a-15d8-4513-89db-107877bee125-etc-swift\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.491580 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94a0271a-15d8-4513-89db-107877bee125-ring-data-devices\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.491602 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nzxs\" (UniqueName: \"kubernetes.io/projected/94a0271a-15d8-4513-89db-107877bee125-kube-api-access-8nzxs\") pod \"swift-ring-rebalance-debug-t779h\" 
(UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.491694 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94a0271a-15d8-4513-89db-107877bee125-dispersionconf\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.491751 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94a0271a-15d8-4513-89db-107877bee125-swiftconf\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.492267 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94a0271a-15d8-4513-89db-107877bee125-etc-swift\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.492451 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94a0271a-15d8-4513-89db-107877bee125-scripts\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.493088 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94a0271a-15d8-4513-89db-107877bee125-ring-data-devices\") pod \"swift-ring-rebalance-debug-t779h\" (UID: 
\"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.495646 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94a0271a-15d8-4513-89db-107877bee125-dispersionconf\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.497120 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94a0271a-15d8-4513-89db-107877bee125-swiftconf\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.517939 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nzxs\" (UniqueName: \"kubernetes.io/projected/94a0271a-15d8-4513-89db-107877bee125-kube-api-access-8nzxs\") pod \"swift-ring-rebalance-debug-t779h\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.635906 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:14 crc kubenswrapper[4738]: I0307 07:32:14.854089 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t779h"] Mar 07 07:32:14 crc kubenswrapper[4738]: W0307 07:32:14.858121 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94a0271a_15d8_4513_89db_107877bee125.slice/crio-489b4208fe603b958200547967915020f8961f62ce290906c7f9f8fc070e078e WatchSource:0}: Error finding container 489b4208fe603b958200547967915020f8961f62ce290906c7f9f8fc070e078e: Status 404 returned error can't find the container with id 489b4208fe603b958200547967915020f8961f62ce290906c7f9f8fc070e078e Mar 07 07:32:15 crc kubenswrapper[4738]: I0307 07:32:15.721562 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" event={"ID":"94a0271a-15d8-4513-89db-107877bee125","Type":"ContainerStarted","Data":"2434ebe447a78d852cc640b82d3bc70eb265b9858ab1df95f1aa08a1cc45cb92"} Mar 07 07:32:15 crc kubenswrapper[4738]: I0307 07:32:15.721864 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" event={"ID":"94a0271a-15d8-4513-89db-107877bee125","Type":"ContainerStarted","Data":"489b4208fe603b958200547967915020f8961f62ce290906c7f9f8fc070e078e"} Mar 07 07:32:15 crc kubenswrapper[4738]: I0307 07:32:15.759882 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" podStartSLOduration=1.759859494 podStartE2EDuration="1.759859494s" podCreationTimestamp="2026-03-07 07:32:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:32:15.751927099 +0000 UTC m=+1954.216914500" watchObservedRunningTime="2026-03-07 
07:32:15.759859494 +0000 UTC m=+1954.224846835" Mar 07 07:32:16 crc kubenswrapper[4738]: I0307 07:32:16.730869 4738 generic.go:334] "Generic (PLEG): container finished" podID="94a0271a-15d8-4513-89db-107877bee125" containerID="2434ebe447a78d852cc640b82d3bc70eb265b9858ab1df95f1aa08a1cc45cb92" exitCode=0 Mar 07 07:32:16 crc kubenswrapper[4738]: I0307 07:32:16.730928 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" event={"ID":"94a0271a-15d8-4513-89db-107877bee125","Type":"ContainerDied","Data":"2434ebe447a78d852cc640b82d3bc70eb265b9858ab1df95f1aa08a1cc45cb92"} Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.073352 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.103288 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t779h"] Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.106239 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t779h"] Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.144644 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nzxs\" (UniqueName: \"kubernetes.io/projected/94a0271a-15d8-4513-89db-107877bee125-kube-api-access-8nzxs\") pod \"94a0271a-15d8-4513-89db-107877bee125\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.144685 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94a0271a-15d8-4513-89db-107877bee125-etc-swift\") pod \"94a0271a-15d8-4513-89db-107877bee125\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.144742 4738 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94a0271a-15d8-4513-89db-107877bee125-ring-data-devices\") pod \"94a0271a-15d8-4513-89db-107877bee125\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.144764 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94a0271a-15d8-4513-89db-107877bee125-swiftconf\") pod \"94a0271a-15d8-4513-89db-107877bee125\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.145524 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a0271a-15d8-4513-89db-107877bee125-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "94a0271a-15d8-4513-89db-107877bee125" (UID: "94a0271a-15d8-4513-89db-107877bee125"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.145857 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94a0271a-15d8-4513-89db-107877bee125-scripts\") pod \"94a0271a-15d8-4513-89db-107877bee125\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.145969 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94a0271a-15d8-4513-89db-107877bee125-dispersionconf\") pod \"94a0271a-15d8-4513-89db-107877bee125\" (UID: \"94a0271a-15d8-4513-89db-107877bee125\") " Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.145976 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94a0271a-15d8-4513-89db-107877bee125-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "94a0271a-15d8-4513-89db-107877bee125" (UID: "94a0271a-15d8-4513-89db-107877bee125"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.146393 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/94a0271a-15d8-4513-89db-107877bee125-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.146477 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/94a0271a-15d8-4513-89db-107877bee125-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.158185 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94a0271a-15d8-4513-89db-107877bee125-kube-api-access-8nzxs" (OuterVolumeSpecName: "kube-api-access-8nzxs") pod "94a0271a-15d8-4513-89db-107877bee125" (UID: "94a0271a-15d8-4513-89db-107877bee125"). InnerVolumeSpecName "kube-api-access-8nzxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.164472 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94a0271a-15d8-4513-89db-107877bee125-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "94a0271a-15d8-4513-89db-107877bee125" (UID: "94a0271a-15d8-4513-89db-107877bee125"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.164868 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94a0271a-15d8-4513-89db-107877bee125-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "94a0271a-15d8-4513-89db-107877bee125" (UID: "94a0271a-15d8-4513-89db-107877bee125"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.173825 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94a0271a-15d8-4513-89db-107877bee125-scripts" (OuterVolumeSpecName: "scripts") pod "94a0271a-15d8-4513-89db-107877bee125" (UID: "94a0271a-15d8-4513-89db-107877bee125"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.247648 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/94a0271a-15d8-4513-89db-107877bee125-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.247683 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nzxs\" (UniqueName: \"kubernetes.io/projected/94a0271a-15d8-4513-89db-107877bee125-kube-api-access-8nzxs\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.247694 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/94a0271a-15d8-4513-89db-107877bee125-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.247702 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/94a0271a-15d8-4513-89db-107877bee125-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.403670 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94a0271a-15d8-4513-89db-107877bee125" path="/var/lib/kubelet/pods/94a0271a-15d8-4513-89db-107877bee125/volumes" Mar 07 07:32:18 crc kubenswrapper[4738]: I0307 07:32:18.749851 4738 scope.go:117] "RemoveContainer" containerID="2434ebe447a78d852cc640b82d3bc70eb265b9858ab1df95f1aa08a1cc45cb92" Mar 07 07:32:18 crc kubenswrapper[4738]: 
I0307 07:32:18.749865 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t779h" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.298776 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd"] Mar 07 07:32:19 crc kubenswrapper[4738]: E0307 07:32:19.299618 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94a0271a-15d8-4513-89db-107877bee125" containerName="swift-ring-rebalance" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.299709 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="94a0271a-15d8-4513-89db-107877bee125" containerName="swift-ring-rebalance" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.299911 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="94a0271a-15d8-4513-89db-107877bee125" containerName="swift-ring-rebalance" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.300532 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.302697 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.302954 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.310245 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd"] Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.363928 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c378ce79-0a90-486d-9b71-5a4b4ebace55-etc-swift\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.363985 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c378ce79-0a90-486d-9b71-5a4b4ebace55-scripts\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.364032 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdqx2\" (UniqueName: \"kubernetes.io/projected/c378ce79-0a90-486d-9b71-5a4b4ebace55-kube-api-access-sdqx2\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.364056 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c378ce79-0a90-486d-9b71-5a4b4ebace55-ring-data-devices\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.364149 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c378ce79-0a90-486d-9b71-5a4b4ebace55-dispersionconf\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.364272 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c378ce79-0a90-486d-9b71-5a4b4ebace55-swiftconf\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.385541 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:32:19 crc kubenswrapper[4738]: E0307 07:32:19.385831 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.466530 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/c378ce79-0a90-486d-9b71-5a4b4ebace55-etc-swift\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.466667 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c378ce79-0a90-486d-9b71-5a4b4ebace55-scripts\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.466758 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdqx2\" (UniqueName: \"kubernetes.io/projected/c378ce79-0a90-486d-9b71-5a4b4ebace55-kube-api-access-sdqx2\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.466797 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c378ce79-0a90-486d-9b71-5a4b4ebace55-ring-data-devices\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.466830 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c378ce79-0a90-486d-9b71-5a4b4ebace55-dispersionconf\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.466919 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c378ce79-0a90-486d-9b71-5a4b4ebace55-swiftconf\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.467045 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c378ce79-0a90-486d-9b71-5a4b4ebace55-etc-swift\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.467386 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c378ce79-0a90-486d-9b71-5a4b4ebace55-scripts\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.468640 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c378ce79-0a90-486d-9b71-5a4b4ebace55-ring-data-devices\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.471807 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c378ce79-0a90-486d-9b71-5a4b4ebace55-swiftconf\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.472603 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/c378ce79-0a90-486d-9b71-5a4b4ebace55-dispersionconf\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.497189 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdqx2\" (UniqueName: \"kubernetes.io/projected/c378ce79-0a90-486d-9b71-5a4b4ebace55-kube-api-access-sdqx2\") pod \"swift-ring-rebalance-debug-nn7gd\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:19 crc kubenswrapper[4738]: I0307 07:32:19.630382 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:20 crc kubenswrapper[4738]: I0307 07:32:20.213070 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd"] Mar 07 07:32:20 crc kubenswrapper[4738]: W0307 07:32:20.231686 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc378ce79_0a90_486d_9b71_5a4b4ebace55.slice/crio-c80d879378b690df4670706d326d933a69ba8f1baaf0111efb058a59ed5e9921 WatchSource:0}: Error finding container c80d879378b690df4670706d326d933a69ba8f1baaf0111efb058a59ed5e9921: Status 404 returned error can't find the container with id c80d879378b690df4670706d326d933a69ba8f1baaf0111efb058a59ed5e9921 Mar 07 07:32:20 crc kubenswrapper[4738]: I0307 07:32:20.768454 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" event={"ID":"c378ce79-0a90-486d-9b71-5a4b4ebace55","Type":"ContainerStarted","Data":"44f0086f8ce78ef36b7cf4982011f04e7ae52e20447e502761e5e0b272872eac"} Mar 07 07:32:20 crc kubenswrapper[4738]: I0307 07:32:20.768798 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" event={"ID":"c378ce79-0a90-486d-9b71-5a4b4ebace55","Type":"ContainerStarted","Data":"c80d879378b690df4670706d326d933a69ba8f1baaf0111efb058a59ed5e9921"} Mar 07 07:32:20 crc kubenswrapper[4738]: I0307 07:32:20.789287 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" podStartSLOduration=1.789270871 podStartE2EDuration="1.789270871s" podCreationTimestamp="2026-03-07 07:32:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:32:20.787880182 +0000 UTC m=+1959.252867513" watchObservedRunningTime="2026-03-07 07:32:20.789270871 +0000 UTC m=+1959.254258192" Mar 07 07:32:22 crc kubenswrapper[4738]: I0307 07:32:22.786966 4738 generic.go:334] "Generic (PLEG): container finished" podID="c378ce79-0a90-486d-9b71-5a4b4ebace55" containerID="44f0086f8ce78ef36b7cf4982011f04e7ae52e20447e502761e5e0b272872eac" exitCode=0 Mar 07 07:32:22 crc kubenswrapper[4738]: I0307 07:32:22.787101 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" event={"ID":"c378ce79-0a90-486d-9b71-5a4b4ebace55","Type":"ContainerDied","Data":"44f0086f8ce78ef36b7cf4982011f04e7ae52e20447e502761e5e0b272872eac"} Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.086479 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.119707 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd"] Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.125776 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd"] Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.170763 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdqx2\" (UniqueName: \"kubernetes.io/projected/c378ce79-0a90-486d-9b71-5a4b4ebace55-kube-api-access-sdqx2\") pod \"c378ce79-0a90-486d-9b71-5a4b4ebace55\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.170810 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c378ce79-0a90-486d-9b71-5a4b4ebace55-scripts\") pod \"c378ce79-0a90-486d-9b71-5a4b4ebace55\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.170832 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c378ce79-0a90-486d-9b71-5a4b4ebace55-etc-swift\") pod \"c378ce79-0a90-486d-9b71-5a4b4ebace55\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.170855 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c378ce79-0a90-486d-9b71-5a4b4ebace55-dispersionconf\") pod \"c378ce79-0a90-486d-9b71-5a4b4ebace55\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.170883 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c378ce79-0a90-486d-9b71-5a4b4ebace55-ring-data-devices\") pod \"c378ce79-0a90-486d-9b71-5a4b4ebace55\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.171631 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c378ce79-0a90-486d-9b71-5a4b4ebace55-swiftconf\") pod \"c378ce79-0a90-486d-9b71-5a4b4ebace55\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.171840 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c378ce79-0a90-486d-9b71-5a4b4ebace55-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c378ce79-0a90-486d-9b71-5a4b4ebace55" (UID: "c378ce79-0a90-486d-9b71-5a4b4ebace55"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.172092 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c378ce79-0a90-486d-9b71-5a4b4ebace55-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c378ce79-0a90-486d-9b71-5a4b4ebace55" (UID: "c378ce79-0a90-486d-9b71-5a4b4ebace55"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.172125 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c378ce79-0a90-486d-9b71-5a4b4ebace55-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.176730 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c378ce79-0a90-486d-9b71-5a4b4ebace55-kube-api-access-sdqx2" (OuterVolumeSpecName: "kube-api-access-sdqx2") pod "c378ce79-0a90-486d-9b71-5a4b4ebace55" (UID: "c378ce79-0a90-486d-9b71-5a4b4ebace55"). InnerVolumeSpecName "kube-api-access-sdqx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.190972 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c378ce79-0a90-486d-9b71-5a4b4ebace55-scripts" (OuterVolumeSpecName: "scripts") pod "c378ce79-0a90-486d-9b71-5a4b4ebace55" (UID: "c378ce79-0a90-486d-9b71-5a4b4ebace55"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:24 crc kubenswrapper[4738]: E0307 07:32:24.191268 4738 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c378ce79-0a90-486d-9b71-5a4b4ebace55-swiftconf podName:c378ce79-0a90-486d-9b71-5a4b4ebace55 nodeName:}" failed. No retries permitted until 2026-03-07 07:32:24.691244316 +0000 UTC m=+1963.156231637 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "swiftconf" (UniqueName: "kubernetes.io/secret/c378ce79-0a90-486d-9b71-5a4b4ebace55-swiftconf") pod "c378ce79-0a90-486d-9b71-5a4b4ebace55" (UID: "c378ce79-0a90-486d-9b71-5a4b4ebace55") : error deleting /var/lib/kubelet/pods/c378ce79-0a90-486d-9b71-5a4b4ebace55/volume-subpaths: remove /var/lib/kubelet/pods/c378ce79-0a90-486d-9b71-5a4b4ebace55/volume-subpaths: no such file or directory Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.192702 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c378ce79-0a90-486d-9b71-5a4b4ebace55-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c378ce79-0a90-486d-9b71-5a4b4ebace55" (UID: "c378ce79-0a90-486d-9b71-5a4b4ebace55"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.273186 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdqx2\" (UniqueName: \"kubernetes.io/projected/c378ce79-0a90-486d-9b71-5a4b4ebace55-kube-api-access-sdqx2\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.273219 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c378ce79-0a90-486d-9b71-5a4b4ebace55-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.273228 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c378ce79-0a90-486d-9b71-5a4b4ebace55-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.273237 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c378ce79-0a90-486d-9b71-5a4b4ebace55-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.780509 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c378ce79-0a90-486d-9b71-5a4b4ebace55-swiftconf\") pod \"c378ce79-0a90-486d-9b71-5a4b4ebace55\" (UID: \"c378ce79-0a90-486d-9b71-5a4b4ebace55\") " Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.783824 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c378ce79-0a90-486d-9b71-5a4b4ebace55-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c378ce79-0a90-486d-9b71-5a4b4ebace55" (UID: "c378ce79-0a90-486d-9b71-5a4b4ebace55"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.801623 4738 scope.go:117] "RemoveContainer" containerID="44f0086f8ce78ef36b7cf4982011f04e7ae52e20447e502761e5e0b272872eac" Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.801719 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nn7gd" Mar 07 07:32:24 crc kubenswrapper[4738]: I0307 07:32:24.883011 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c378ce79-0a90-486d-9b71-5a4b4ebace55-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.336031 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx"] Mar 07 07:32:25 crc kubenswrapper[4738]: E0307 07:32:25.336305 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c378ce79-0a90-486d-9b71-5a4b4ebace55" containerName="swift-ring-rebalance" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.336318 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="c378ce79-0a90-486d-9b71-5a4b4ebace55" containerName="swift-ring-rebalance" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.336476 4738 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c378ce79-0a90-486d-9b71-5a4b4ebace55" containerName="swift-ring-rebalance" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.336933 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.341128 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.341239 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.348879 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx"] Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.390552 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0f1eb014-9492-48df-9ce2-02e900d04064-etc-swift\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.390626 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k49ml\" (UniqueName: \"kubernetes.io/projected/0f1eb014-9492-48df-9ce2-02e900d04064-kube-api-access-k49ml\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.390669 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f1eb014-9492-48df-9ce2-02e900d04064-scripts\") pod 
\"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.390731 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0f1eb014-9492-48df-9ce2-02e900d04064-ring-data-devices\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.390776 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0f1eb014-9492-48df-9ce2-02e900d04064-dispersionconf\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.390859 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0f1eb014-9492-48df-9ce2-02e900d04064-swiftconf\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.492315 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0f1eb014-9492-48df-9ce2-02e900d04064-ring-data-devices\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.492391 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/0f1eb014-9492-48df-9ce2-02e900d04064-dispersionconf\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.492474 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0f1eb014-9492-48df-9ce2-02e900d04064-swiftconf\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.492547 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0f1eb014-9492-48df-9ce2-02e900d04064-etc-swift\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.492588 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k49ml\" (UniqueName: \"kubernetes.io/projected/0f1eb014-9492-48df-9ce2-02e900d04064-kube-api-access-k49ml\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.492635 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f1eb014-9492-48df-9ce2-02e900d04064-scripts\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.493504 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/0f1eb014-9492-48df-9ce2-02e900d04064-etc-swift\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.493877 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f1eb014-9492-48df-9ce2-02e900d04064-scripts\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.493995 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0f1eb014-9492-48df-9ce2-02e900d04064-ring-data-devices\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.497140 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0f1eb014-9492-48df-9ce2-02e900d04064-dispersionconf\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.509477 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0f1eb014-9492-48df-9ce2-02e900d04064-swiftconf\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.510368 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k49ml\" (UniqueName: 
\"kubernetes.io/projected/0f1eb014-9492-48df-9ce2-02e900d04064-kube-api-access-k49ml\") pod \"swift-ring-rebalance-debug-hxbwx\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:25 crc kubenswrapper[4738]: I0307 07:32:25.657825 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:26 crc kubenswrapper[4738]: I0307 07:32:26.152697 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx"] Mar 07 07:32:26 crc kubenswrapper[4738]: W0307 07:32:26.163306 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f1eb014_9492_48df_9ce2_02e900d04064.slice/crio-92a016b1d78fbb823eb1d2d8c5b2a15e04262513cc86650aa640250d2f947753 WatchSource:0}: Error finding container 92a016b1d78fbb823eb1d2d8c5b2a15e04262513cc86650aa640250d2f947753: Status 404 returned error can't find the container with id 92a016b1d78fbb823eb1d2d8c5b2a15e04262513cc86650aa640250d2f947753 Mar 07 07:32:26 crc kubenswrapper[4738]: I0307 07:32:26.398127 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c378ce79-0a90-486d-9b71-5a4b4ebace55" path="/var/lib/kubelet/pods/c378ce79-0a90-486d-9b71-5a4b4ebace55/volumes" Mar 07 07:32:26 crc kubenswrapper[4738]: I0307 07:32:26.832128 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" event={"ID":"0f1eb014-9492-48df-9ce2-02e900d04064","Type":"ContainerStarted","Data":"a0de4886164ab4df2fed28ccdf3992f4ef09a3915ab97a1bf78cb75eb2a6f990"} Mar 07 07:32:26 crc kubenswrapper[4738]: I0307 07:32:26.832182 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" 
event={"ID":"0f1eb014-9492-48df-9ce2-02e900d04064","Type":"ContainerStarted","Data":"92a016b1d78fbb823eb1d2d8c5b2a15e04262513cc86650aa640250d2f947753"} Mar 07 07:32:26 crc kubenswrapper[4738]: I0307 07:32:26.888734 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" podStartSLOduration=1.8887199460000001 podStartE2EDuration="1.888719946s" podCreationTimestamp="2026-03-07 07:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:32:26.884662275 +0000 UTC m=+1965.349649606" watchObservedRunningTime="2026-03-07 07:32:26.888719946 +0000 UTC m=+1965.353707267" Mar 07 07:32:28 crc kubenswrapper[4738]: I0307 07:32:28.853181 4738 generic.go:334] "Generic (PLEG): container finished" podID="0f1eb014-9492-48df-9ce2-02e900d04064" containerID="a0de4886164ab4df2fed28ccdf3992f4ef09a3915ab97a1bf78cb75eb2a6f990" exitCode=0 Mar 07 07:32:28 crc kubenswrapper[4738]: I0307 07:32:28.853301 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" event={"ID":"0f1eb014-9492-48df-9ce2-02e900d04064","Type":"ContainerDied","Data":"a0de4886164ab4df2fed28ccdf3992f4ef09a3915ab97a1bf78cb75eb2a6f990"} Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.269934 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.309582 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx"] Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.318817 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx"] Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.378843 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0f1eb014-9492-48df-9ce2-02e900d04064-swiftconf\") pod \"0f1eb014-9492-48df-9ce2-02e900d04064\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.378961 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0f1eb014-9492-48df-9ce2-02e900d04064-dispersionconf\") pod \"0f1eb014-9492-48df-9ce2-02e900d04064\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.379007 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0f1eb014-9492-48df-9ce2-02e900d04064-ring-data-devices\") pod \"0f1eb014-9492-48df-9ce2-02e900d04064\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.379067 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k49ml\" (UniqueName: \"kubernetes.io/projected/0f1eb014-9492-48df-9ce2-02e900d04064-kube-api-access-k49ml\") pod \"0f1eb014-9492-48df-9ce2-02e900d04064\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.379276 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0f1eb014-9492-48df-9ce2-02e900d04064-etc-swift\") pod \"0f1eb014-9492-48df-9ce2-02e900d04064\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.379371 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f1eb014-9492-48df-9ce2-02e900d04064-scripts\") pod \"0f1eb014-9492-48df-9ce2-02e900d04064\" (UID: \"0f1eb014-9492-48df-9ce2-02e900d04064\") " Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.379962 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f1eb014-9492-48df-9ce2-02e900d04064-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0f1eb014-9492-48df-9ce2-02e900d04064" (UID: "0f1eb014-9492-48df-9ce2-02e900d04064"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.380006 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0f1eb014-9492-48df-9ce2-02e900d04064-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0f1eb014-9492-48df-9ce2-02e900d04064" (UID: "0f1eb014-9492-48df-9ce2-02e900d04064"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.384906 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f1eb014-9492-48df-9ce2-02e900d04064-kube-api-access-k49ml" (OuterVolumeSpecName: "kube-api-access-k49ml") pod "0f1eb014-9492-48df-9ce2-02e900d04064" (UID: "0f1eb014-9492-48df-9ce2-02e900d04064"). InnerVolumeSpecName "kube-api-access-k49ml". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.416906 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f1eb014-9492-48df-9ce2-02e900d04064-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0f1eb014-9492-48df-9ce2-02e900d04064" (UID: "0f1eb014-9492-48df-9ce2-02e900d04064"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.418050 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0f1eb014-9492-48df-9ce2-02e900d04064-scripts" (OuterVolumeSpecName: "scripts") pod "0f1eb014-9492-48df-9ce2-02e900d04064" (UID: "0f1eb014-9492-48df-9ce2-02e900d04064"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.419584 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f1eb014-9492-48df-9ce2-02e900d04064-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0f1eb014-9492-48df-9ce2-02e900d04064" (UID: "0f1eb014-9492-48df-9ce2-02e900d04064"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.481925 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k49ml\" (UniqueName: \"kubernetes.io/projected/0f1eb014-9492-48df-9ce2-02e900d04064-kube-api-access-k49ml\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.481959 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0f1eb014-9492-48df-9ce2-02e900d04064-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.481972 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0f1eb014-9492-48df-9ce2-02e900d04064-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.481983 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0f1eb014-9492-48df-9ce2-02e900d04064-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.481995 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0f1eb014-9492-48df-9ce2-02e900d04064-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.482006 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0f1eb014-9492-48df-9ce2-02e900d04064-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.873869 4738 scope.go:117] "RemoveContainer" containerID="a0de4886164ab4df2fed28ccdf3992f4ef09a3915ab97a1bf78cb75eb2a6f990" Mar 07 07:32:30 crc kubenswrapper[4738]: I0307 07:32:30.873953 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hxbwx" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.514375 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j69hg"] Mar 07 07:32:31 crc kubenswrapper[4738]: E0307 07:32:31.514772 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f1eb014-9492-48df-9ce2-02e900d04064" containerName="swift-ring-rebalance" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.514789 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f1eb014-9492-48df-9ce2-02e900d04064" containerName="swift-ring-rebalance" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.515011 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f1eb014-9492-48df-9ce2-02e900d04064" containerName="swift-ring-rebalance" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.515624 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.518335 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.518345 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.530073 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j69hg"] Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.600582 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/140d1d0e-03d1-4c4e-aa66-451cc97b5656-ring-data-devices\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.600647 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/140d1d0e-03d1-4c4e-aa66-451cc97b5656-etc-swift\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.600669 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/140d1d0e-03d1-4c4e-aa66-451cc97b5656-dispersionconf\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.600746 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/140d1d0e-03d1-4c4e-aa66-451cc97b5656-swiftconf\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.600792 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/140d1d0e-03d1-4c4e-aa66-451cc97b5656-scripts\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.600817 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhsg5\" (UniqueName: 
\"kubernetes.io/projected/140d1d0e-03d1-4c4e-aa66-451cc97b5656-kube-api-access-mhsg5\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.702530 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/140d1d0e-03d1-4c4e-aa66-451cc97b5656-swiftconf\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.702646 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/140d1d0e-03d1-4c4e-aa66-451cc97b5656-scripts\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.702698 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhsg5\" (UniqueName: \"kubernetes.io/projected/140d1d0e-03d1-4c4e-aa66-451cc97b5656-kube-api-access-mhsg5\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.702770 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/140d1d0e-03d1-4c4e-aa66-451cc97b5656-ring-data-devices\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.702830 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/140d1d0e-03d1-4c4e-aa66-451cc97b5656-etc-swift\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.702862 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/140d1d0e-03d1-4c4e-aa66-451cc97b5656-dispersionconf\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.704432 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/140d1d0e-03d1-4c4e-aa66-451cc97b5656-ring-data-devices\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.704582 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/140d1d0e-03d1-4c4e-aa66-451cc97b5656-etc-swift\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.705373 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/140d1d0e-03d1-4c4e-aa66-451cc97b5656-scripts\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.708278 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/140d1d0e-03d1-4c4e-aa66-451cc97b5656-dispersionconf\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.709423 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/140d1d0e-03d1-4c4e-aa66-451cc97b5656-swiftconf\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.726362 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhsg5\" (UniqueName: \"kubernetes.io/projected/140d1d0e-03d1-4c4e-aa66-451cc97b5656-kube-api-access-mhsg5\") pod \"swift-ring-rebalance-debug-j69hg\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:31 crc kubenswrapper[4738]: I0307 07:32:31.841129 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:32 crc kubenswrapper[4738]: I0307 07:32:32.418203 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f1eb014-9492-48df-9ce2-02e900d04064" path="/var/lib/kubelet/pods/0f1eb014-9492-48df-9ce2-02e900d04064/volumes" Mar 07 07:32:32 crc kubenswrapper[4738]: I0307 07:32:32.424267 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j69hg"] Mar 07 07:32:32 crc kubenswrapper[4738]: I0307 07:32:32.894572 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" event={"ID":"140d1d0e-03d1-4c4e-aa66-451cc97b5656","Type":"ContainerStarted","Data":"1c98c1b4e6c3d1f67d18fe1b86b4abcac3ad6407f555880bf2c9be9e7789d590"} Mar 07 07:32:32 crc kubenswrapper[4738]: I0307 07:32:32.894903 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" event={"ID":"140d1d0e-03d1-4c4e-aa66-451cc97b5656","Type":"ContainerStarted","Data":"a1a1da211a172c52631e2fc000222702e50469806c12d2fb1323f9c4ebfabce0"} Mar 07 07:32:32 crc kubenswrapper[4738]: I0307 07:32:32.911877 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" podStartSLOduration=1.911857254 podStartE2EDuration="1.911857254s" podCreationTimestamp="2026-03-07 07:32:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:32:32.90837709 +0000 UTC m=+1971.373364421" watchObservedRunningTime="2026-03-07 07:32:32.911857254 +0000 UTC m=+1971.376844575" Mar 07 07:32:34 crc kubenswrapper[4738]: I0307 07:32:34.386058 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:32:34 crc kubenswrapper[4738]: E0307 07:32:34.386720 4738 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:32:34 crc kubenswrapper[4738]: I0307 07:32:34.912114 4738 generic.go:334] "Generic (PLEG): container finished" podID="140d1d0e-03d1-4c4e-aa66-451cc97b5656" containerID="1c98c1b4e6c3d1f67d18fe1b86b4abcac3ad6407f555880bf2c9be9e7789d590" exitCode=0 Mar 07 07:32:34 crc kubenswrapper[4738]: I0307 07:32:34.912195 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" event={"ID":"140d1d0e-03d1-4c4e-aa66-451cc97b5656","Type":"ContainerDied","Data":"1c98c1b4e6c3d1f67d18fe1b86b4abcac3ad6407f555880bf2c9be9e7789d590"} Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.247862 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.280070 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j69hg"] Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.286845 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j69hg"] Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.396462 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/140d1d0e-03d1-4c4e-aa66-451cc97b5656-dispersionconf\") pod \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.396713 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/140d1d0e-03d1-4c4e-aa66-451cc97b5656-scripts\") pod \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.396750 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/140d1d0e-03d1-4c4e-aa66-451cc97b5656-ring-data-devices\") pod \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.396783 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/140d1d0e-03d1-4c4e-aa66-451cc97b5656-etc-swift\") pod \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.396800 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mhsg5\" (UniqueName: \"kubernetes.io/projected/140d1d0e-03d1-4c4e-aa66-451cc97b5656-kube-api-access-mhsg5\") pod \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.396845 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/140d1d0e-03d1-4c4e-aa66-451cc97b5656-swiftconf\") pod \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\" (UID: \"140d1d0e-03d1-4c4e-aa66-451cc97b5656\") " Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.397576 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/140d1d0e-03d1-4c4e-aa66-451cc97b5656-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "140d1d0e-03d1-4c4e-aa66-451cc97b5656" (UID: "140d1d0e-03d1-4c4e-aa66-451cc97b5656"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.397834 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/140d1d0e-03d1-4c4e-aa66-451cc97b5656-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "140d1d0e-03d1-4c4e-aa66-451cc97b5656" (UID: "140d1d0e-03d1-4c4e-aa66-451cc97b5656"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.403024 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/140d1d0e-03d1-4c4e-aa66-451cc97b5656-kube-api-access-mhsg5" (OuterVolumeSpecName: "kube-api-access-mhsg5") pod "140d1d0e-03d1-4c4e-aa66-451cc97b5656" (UID: "140d1d0e-03d1-4c4e-aa66-451cc97b5656"). InnerVolumeSpecName "kube-api-access-mhsg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.421106 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/140d1d0e-03d1-4c4e-aa66-451cc97b5656-scripts" (OuterVolumeSpecName: "scripts") pod "140d1d0e-03d1-4c4e-aa66-451cc97b5656" (UID: "140d1d0e-03d1-4c4e-aa66-451cc97b5656"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.429609 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/140d1d0e-03d1-4c4e-aa66-451cc97b5656-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "140d1d0e-03d1-4c4e-aa66-451cc97b5656" (UID: "140d1d0e-03d1-4c4e-aa66-451cc97b5656"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.442816 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/140d1d0e-03d1-4c4e-aa66-451cc97b5656-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "140d1d0e-03d1-4c4e-aa66-451cc97b5656" (UID: "140d1d0e-03d1-4c4e-aa66-451cc97b5656"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.497989 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/140d1d0e-03d1-4c4e-aa66-451cc97b5656-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.498023 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/140d1d0e-03d1-4c4e-aa66-451cc97b5656-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.498139 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/140d1d0e-03d1-4c4e-aa66-451cc97b5656-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.498173 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/140d1d0e-03d1-4c4e-aa66-451cc97b5656-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.498183 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhsg5\" (UniqueName: \"kubernetes.io/projected/140d1d0e-03d1-4c4e-aa66-451cc97b5656-kube-api-access-mhsg5\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.498194 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/140d1d0e-03d1-4c4e-aa66-451cc97b5656-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.939250 4738 scope.go:117] "RemoveContainer" containerID="1c98c1b4e6c3d1f67d18fe1b86b4abcac3ad6407f555880bf2c9be9e7789d590" Mar 07 07:32:36 crc kubenswrapper[4738]: I0307 07:32:36.939370 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j69hg" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.510708 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4"] Mar 07 07:32:37 crc kubenswrapper[4738]: E0307 07:32:37.511095 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="140d1d0e-03d1-4c4e-aa66-451cc97b5656" containerName="swift-ring-rebalance" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.511112 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="140d1d0e-03d1-4c4e-aa66-451cc97b5656" containerName="swift-ring-rebalance" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.511341 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="140d1d0e-03d1-4c4e-aa66-451cc97b5656" containerName="swift-ring-rebalance" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.511954 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.513991 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.514207 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.519957 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4"] Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.613440 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aeea3195-8f50-4500-951d-128f5032b066-etc-swift\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.613530 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aeea3195-8f50-4500-951d-128f5032b066-dispersionconf\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.613978 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aeea3195-8f50-4500-951d-128f5032b066-swiftconf\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.614016 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aeea3195-8f50-4500-951d-128f5032b066-ring-data-devices\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.614287 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klq79\" (UniqueName: \"kubernetes.io/projected/aeea3195-8f50-4500-951d-128f5032b066-kube-api-access-klq79\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.614474 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/aeea3195-8f50-4500-951d-128f5032b066-scripts\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.715912 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aeea3195-8f50-4500-951d-128f5032b066-dispersionconf\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.716039 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aeea3195-8f50-4500-951d-128f5032b066-swiftconf\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.716086 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aeea3195-8f50-4500-951d-128f5032b066-ring-data-devices\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.716199 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klq79\" (UniqueName: \"kubernetes.io/projected/aeea3195-8f50-4500-951d-128f5032b066-kube-api-access-klq79\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.716274 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/aeea3195-8f50-4500-951d-128f5032b066-scripts\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.716313 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aeea3195-8f50-4500-951d-128f5032b066-etc-swift\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.717714 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aeea3195-8f50-4500-951d-128f5032b066-scripts\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.717821 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aeea3195-8f50-4500-951d-128f5032b066-etc-swift\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.718683 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aeea3195-8f50-4500-951d-128f5032b066-ring-data-devices\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.726843 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/aeea3195-8f50-4500-951d-128f5032b066-dispersionconf\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.726858 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aeea3195-8f50-4500-951d-128f5032b066-swiftconf\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.742181 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klq79\" (UniqueName: \"kubernetes.io/projected/aeea3195-8f50-4500-951d-128f5032b066-kube-api-access-klq79\") pod \"swift-ring-rebalance-debug-n6nd4\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:37 crc kubenswrapper[4738]: I0307 07:32:37.829992 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:38 crc kubenswrapper[4738]: I0307 07:32:38.058573 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4"] Mar 07 07:32:38 crc kubenswrapper[4738]: W0307 07:32:38.072305 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaeea3195_8f50_4500_951d_128f5032b066.slice/crio-ef6b61f88eddafb289584b973723abbbfaf813d3efba4d4aaa8bc7642ec503ad WatchSource:0}: Error finding container ef6b61f88eddafb289584b973723abbbfaf813d3efba4d4aaa8bc7642ec503ad: Status 404 returned error can't find the container with id ef6b61f88eddafb289584b973723abbbfaf813d3efba4d4aaa8bc7642ec503ad Mar 07 07:32:38 crc kubenswrapper[4738]: I0307 07:32:38.396772 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="140d1d0e-03d1-4c4e-aa66-451cc97b5656" path="/var/lib/kubelet/pods/140d1d0e-03d1-4c4e-aa66-451cc97b5656/volumes" Mar 07 07:32:38 crc kubenswrapper[4738]: I0307 07:32:38.963664 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" event={"ID":"aeea3195-8f50-4500-951d-128f5032b066","Type":"ContainerStarted","Data":"d3f99dcec3b3cf845a344418fc48b7c6f27393dad5fc0da1bea1fe962412cf64"} Mar 07 07:32:38 crc kubenswrapper[4738]: I0307 07:32:38.964545 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" event={"ID":"aeea3195-8f50-4500-951d-128f5032b066","Type":"ContainerStarted","Data":"ef6b61f88eddafb289584b973723abbbfaf813d3efba4d4aaa8bc7642ec503ad"} Mar 07 07:32:38 crc kubenswrapper[4738]: I0307 07:32:38.993498 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" podStartSLOduration=1.993481407 podStartE2EDuration="1.993481407s" podCreationTimestamp="2026-03-07 
07:32:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:32:38.989939761 +0000 UTC m=+1977.454927092" watchObservedRunningTime="2026-03-07 07:32:38.993481407 +0000 UTC m=+1977.458468738" Mar 07 07:32:39 crc kubenswrapper[4738]: I0307 07:32:39.975896 4738 generic.go:334] "Generic (PLEG): container finished" podID="aeea3195-8f50-4500-951d-128f5032b066" containerID="d3f99dcec3b3cf845a344418fc48b7c6f27393dad5fc0da1bea1fe962412cf64" exitCode=0 Mar 07 07:32:39 crc kubenswrapper[4738]: I0307 07:32:39.975994 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" event={"ID":"aeea3195-8f50-4500-951d-128f5032b066","Type":"ContainerDied","Data":"d3f99dcec3b3cf845a344418fc48b7c6f27393dad5fc0da1bea1fe962412cf64"} Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.265311 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.295487 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aeea3195-8f50-4500-951d-128f5032b066-etc-swift\") pod \"aeea3195-8f50-4500-951d-128f5032b066\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.295561 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aeea3195-8f50-4500-951d-128f5032b066-swiftconf\") pod \"aeea3195-8f50-4500-951d-128f5032b066\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.295637 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/aeea3195-8f50-4500-951d-128f5032b066-dispersionconf\") pod \"aeea3195-8f50-4500-951d-128f5032b066\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.295767 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klq79\" (UniqueName: \"kubernetes.io/projected/aeea3195-8f50-4500-951d-128f5032b066-kube-api-access-klq79\") pod \"aeea3195-8f50-4500-951d-128f5032b066\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.295825 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aeea3195-8f50-4500-951d-128f5032b066-scripts\") pod \"aeea3195-8f50-4500-951d-128f5032b066\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.295857 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aeea3195-8f50-4500-951d-128f5032b066-ring-data-devices\") pod \"aeea3195-8f50-4500-951d-128f5032b066\" (UID: \"aeea3195-8f50-4500-951d-128f5032b066\") " Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.296689 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeea3195-8f50-4500-951d-128f5032b066-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "aeea3195-8f50-4500-951d-128f5032b066" (UID: "aeea3195-8f50-4500-951d-128f5032b066"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.296845 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aeea3195-8f50-4500-951d-128f5032b066-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "aeea3195-8f50-4500-951d-128f5032b066" (UID: "aeea3195-8f50-4500-951d-128f5032b066"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.302493 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aeea3195-8f50-4500-951d-128f5032b066-kube-api-access-klq79" (OuterVolumeSpecName: "kube-api-access-klq79") pod "aeea3195-8f50-4500-951d-128f5032b066" (UID: "aeea3195-8f50-4500-951d-128f5032b066"). InnerVolumeSpecName "kube-api-access-klq79". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.314664 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4"] Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.318493 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aeea3195-8f50-4500-951d-128f5032b066-scripts" (OuterVolumeSpecName: "scripts") pod "aeea3195-8f50-4500-951d-128f5032b066" (UID: "aeea3195-8f50-4500-951d-128f5032b066"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.320202 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeea3195-8f50-4500-951d-128f5032b066-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "aeea3195-8f50-4500-951d-128f5032b066" (UID: "aeea3195-8f50-4500-951d-128f5032b066"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.321059 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4"] Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.330365 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aeea3195-8f50-4500-951d-128f5032b066-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "aeea3195-8f50-4500-951d-128f5032b066" (UID: "aeea3195-8f50-4500-951d-128f5032b066"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.397374 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/aeea3195-8f50-4500-951d-128f5032b066-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.397416 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/aeea3195-8f50-4500-951d-128f5032b066-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.397428 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/aeea3195-8f50-4500-951d-128f5032b066-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.397436 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/aeea3195-8f50-4500-951d-128f5032b066-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.397445 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klq79\" (UniqueName: \"kubernetes.io/projected/aeea3195-8f50-4500-951d-128f5032b066-kube-api-access-klq79\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:41 
crc kubenswrapper[4738]: I0307 07:32:41.397454 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/aeea3195-8f50-4500-951d-128f5032b066-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.992778 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef6b61f88eddafb289584b973723abbbfaf813d3efba4d4aaa8bc7642ec503ad" Mar 07 07:32:41 crc kubenswrapper[4738]: I0307 07:32:41.992898 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-n6nd4" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.404927 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aeea3195-8f50-4500-951d-128f5032b066" path="/var/lib/kubelet/pods/aeea3195-8f50-4500-951d-128f5032b066/volumes" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.467904 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj"] Mar 07 07:32:42 crc kubenswrapper[4738]: E0307 07:32:42.483066 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aeea3195-8f50-4500-951d-128f5032b066" containerName="swift-ring-rebalance" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.483130 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="aeea3195-8f50-4500-951d-128f5032b066" containerName="swift-ring-rebalance" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.483385 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="aeea3195-8f50-4500-951d-128f5032b066" containerName="swift-ring-rebalance" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.483905 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj"] Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.484071 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.487026 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.487239 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.520277 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mlcz\" (UniqueName: \"kubernetes.io/projected/e81b8bf9-2fc3-4679-bcc1-978997372f96-kube-api-access-5mlcz\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.520358 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e81b8bf9-2fc3-4679-bcc1-978997372f96-ring-data-devices\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.520396 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e81b8bf9-2fc3-4679-bcc1-978997372f96-etc-swift\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.520503 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e81b8bf9-2fc3-4679-bcc1-978997372f96-swiftconf\") pod 
\"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.520574 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e81b8bf9-2fc3-4679-bcc1-978997372f96-scripts\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.520672 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e81b8bf9-2fc3-4679-bcc1-978997372f96-dispersionconf\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.622109 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e81b8bf9-2fc3-4679-bcc1-978997372f96-swiftconf\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.622175 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e81b8bf9-2fc3-4679-bcc1-978997372f96-scripts\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.622221 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/e81b8bf9-2fc3-4679-bcc1-978997372f96-dispersionconf\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.622263 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mlcz\" (UniqueName: \"kubernetes.io/projected/e81b8bf9-2fc3-4679-bcc1-978997372f96-kube-api-access-5mlcz\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.622289 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e81b8bf9-2fc3-4679-bcc1-978997372f96-ring-data-devices\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.622306 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e81b8bf9-2fc3-4679-bcc1-978997372f96-etc-swift\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.622648 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e81b8bf9-2fc3-4679-bcc1-978997372f96-etc-swift\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.624374 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/e81b8bf9-2fc3-4679-bcc1-978997372f96-scripts\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.624797 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e81b8bf9-2fc3-4679-bcc1-978997372f96-ring-data-devices\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.631634 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e81b8bf9-2fc3-4679-bcc1-978997372f96-swiftconf\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.631871 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e81b8bf9-2fc3-4679-bcc1-978997372f96-dispersionconf\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.641726 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mlcz\" (UniqueName: \"kubernetes.io/projected/e81b8bf9-2fc3-4679-bcc1-978997372f96-kube-api-access-5mlcz\") pod \"swift-ring-rebalance-debug-7qqjj\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:42 crc kubenswrapper[4738]: I0307 07:32:42.815310 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:43 crc kubenswrapper[4738]: I0307 07:32:43.294108 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj"] Mar 07 07:32:43 crc kubenswrapper[4738]: W0307 07:32:43.298304 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode81b8bf9_2fc3_4679_bcc1_978997372f96.slice/crio-3d2bbe15481a37ffdfffeb21ae5872622aaddf756b204e310990f5c74ba916d9 WatchSource:0}: Error finding container 3d2bbe15481a37ffdfffeb21ae5872622aaddf756b204e310990f5c74ba916d9: Status 404 returned error can't find the container with id 3d2bbe15481a37ffdfffeb21ae5872622aaddf756b204e310990f5c74ba916d9 Mar 07 07:32:44 crc kubenswrapper[4738]: I0307 07:32:44.013039 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" event={"ID":"e81b8bf9-2fc3-4679-bcc1-978997372f96","Type":"ContainerStarted","Data":"e32bd244bb34a37b1b79e1c4299e9cfa9c424ad82ffcc2c6cb47421fb03e3a22"} Mar 07 07:32:44 crc kubenswrapper[4738]: I0307 07:32:44.013423 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" event={"ID":"e81b8bf9-2fc3-4679-bcc1-978997372f96","Type":"ContainerStarted","Data":"3d2bbe15481a37ffdfffeb21ae5872622aaddf756b204e310990f5c74ba916d9"} Mar 07 07:32:44 crc kubenswrapper[4738]: I0307 07:32:44.041477 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" podStartSLOduration=2.041458278 podStartE2EDuration="2.041458278s" podCreationTimestamp="2026-03-07 07:32:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:32:44.037479889 +0000 UTC m=+1982.502467210" watchObservedRunningTime="2026-03-07 
07:32:44.041458278 +0000 UTC m=+1982.506445599" Mar 07 07:32:45 crc kubenswrapper[4738]: I0307 07:32:45.020631 4738 generic.go:334] "Generic (PLEG): container finished" podID="e81b8bf9-2fc3-4679-bcc1-978997372f96" containerID="e32bd244bb34a37b1b79e1c4299e9cfa9c424ad82ffcc2c6cb47421fb03e3a22" exitCode=0 Mar 07 07:32:45 crc kubenswrapper[4738]: I0307 07:32:45.020678 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" event={"ID":"e81b8bf9-2fc3-4679-bcc1-978997372f96","Type":"ContainerDied","Data":"e32bd244bb34a37b1b79e1c4299e9cfa9c424ad82ffcc2c6cb47421fb03e3a22"} Mar 07 07:32:45 crc kubenswrapper[4738]: I0307 07:32:45.893521 4738 scope.go:117] "RemoveContainer" containerID="0e720457c242f88560492c41c224f50e15dda393fb14314e4d0aee6533f8e980" Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.339286 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.370986 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj"] Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.384417 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj"] Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.385402 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:32:46 crc kubenswrapper[4738]: E0307 07:32:46.387464 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" 
podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.480347 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e81b8bf9-2fc3-4679-bcc1-978997372f96-swiftconf\") pod \"e81b8bf9-2fc3-4679-bcc1-978997372f96\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.480396 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mlcz\" (UniqueName: \"kubernetes.io/projected/e81b8bf9-2fc3-4679-bcc1-978997372f96-kube-api-access-5mlcz\") pod \"e81b8bf9-2fc3-4679-bcc1-978997372f96\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.480422 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e81b8bf9-2fc3-4679-bcc1-978997372f96-ring-data-devices\") pod \"e81b8bf9-2fc3-4679-bcc1-978997372f96\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.480440 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e81b8bf9-2fc3-4679-bcc1-978997372f96-scripts\") pod \"e81b8bf9-2fc3-4679-bcc1-978997372f96\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.480507 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e81b8bf9-2fc3-4679-bcc1-978997372f96-etc-swift\") pod \"e81b8bf9-2fc3-4679-bcc1-978997372f96\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.480549 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/e81b8bf9-2fc3-4679-bcc1-978997372f96-dispersionconf\") pod \"e81b8bf9-2fc3-4679-bcc1-978997372f96\" (UID: \"e81b8bf9-2fc3-4679-bcc1-978997372f96\") " Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.481569 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e81b8bf9-2fc3-4679-bcc1-978997372f96-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e81b8bf9-2fc3-4679-bcc1-978997372f96" (UID: "e81b8bf9-2fc3-4679-bcc1-978997372f96"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.481626 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e81b8bf9-2fc3-4679-bcc1-978997372f96-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e81b8bf9-2fc3-4679-bcc1-978997372f96" (UID: "e81b8bf9-2fc3-4679-bcc1-978997372f96"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.487407 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81b8bf9-2fc3-4679-bcc1-978997372f96-kube-api-access-5mlcz" (OuterVolumeSpecName: "kube-api-access-5mlcz") pod "e81b8bf9-2fc3-4679-bcc1-978997372f96" (UID: "e81b8bf9-2fc3-4679-bcc1-978997372f96"). InnerVolumeSpecName "kube-api-access-5mlcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.513128 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e81b8bf9-2fc3-4679-bcc1-978997372f96-scripts" (OuterVolumeSpecName: "scripts") pod "e81b8bf9-2fc3-4679-bcc1-978997372f96" (UID: "e81b8bf9-2fc3-4679-bcc1-978997372f96"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.520774 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81b8bf9-2fc3-4679-bcc1-978997372f96-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e81b8bf9-2fc3-4679-bcc1-978997372f96" (UID: "e81b8bf9-2fc3-4679-bcc1-978997372f96"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.521182 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81b8bf9-2fc3-4679-bcc1-978997372f96-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e81b8bf9-2fc3-4679-bcc1-978997372f96" (UID: "e81b8bf9-2fc3-4679-bcc1-978997372f96"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.583041 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e81b8bf9-2fc3-4679-bcc1-978997372f96-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.583077 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e81b8bf9-2fc3-4679-bcc1-978997372f96-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.583087 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mlcz\" (UniqueName: \"kubernetes.io/projected/e81b8bf9-2fc3-4679-bcc1-978997372f96-kube-api-access-5mlcz\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.583097 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e81b8bf9-2fc3-4679-bcc1-978997372f96-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 
07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.583108 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e81b8bf9-2fc3-4679-bcc1-978997372f96-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:46 crc kubenswrapper[4738]: I0307 07:32:46.583126 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e81b8bf9-2fc3-4679-bcc1-978997372f96-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.037835 4738 scope.go:117] "RemoveContainer" containerID="e32bd244bb34a37b1b79e1c4299e9cfa9c424ad82ffcc2c6cb47421fb03e3a22" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.037891 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7qqjj" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.564930 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v"] Mar 07 07:32:47 crc kubenswrapper[4738]: E0307 07:32:47.565449 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81b8bf9-2fc3-4679-bcc1-978997372f96" containerName="swift-ring-rebalance" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.565461 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81b8bf9-2fc3-4679-bcc1-978997372f96" containerName="swift-ring-rebalance" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.565588 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81b8bf9-2fc3-4679-bcc1-978997372f96" containerName="swift-ring-rebalance" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.566049 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.568315 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.568484 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.579444 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v"] Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.698136 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/79b8d0fc-8e59-441b-ba94-6984ee28be95-etc-swift\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.698279 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/79b8d0fc-8e59-441b-ba94-6984ee28be95-swiftconf\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.698358 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/79b8d0fc-8e59-441b-ba94-6984ee28be95-dispersionconf\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.698464 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-96l5b\" (UniqueName: \"kubernetes.io/projected/79b8d0fc-8e59-441b-ba94-6984ee28be95-kube-api-access-96l5b\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.698536 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79b8d0fc-8e59-441b-ba94-6984ee28be95-scripts\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.698731 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/79b8d0fc-8e59-441b-ba94-6984ee28be95-ring-data-devices\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.806258 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96l5b\" (UniqueName: \"kubernetes.io/projected/79b8d0fc-8e59-441b-ba94-6984ee28be95-kube-api-access-96l5b\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.806322 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79b8d0fc-8e59-441b-ba94-6984ee28be95-scripts\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 
07:32:47.806409 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/79b8d0fc-8e59-441b-ba94-6984ee28be95-ring-data-devices\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.806461 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/79b8d0fc-8e59-441b-ba94-6984ee28be95-etc-swift\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.806519 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/79b8d0fc-8e59-441b-ba94-6984ee28be95-swiftconf\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.807050 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/79b8d0fc-8e59-441b-ba94-6984ee28be95-etc-swift\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.806539 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/79b8d0fc-8e59-441b-ba94-6984ee28be95-dispersionconf\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 
07:32:47.807372 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/79b8d0fc-8e59-441b-ba94-6984ee28be95-ring-data-devices\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.807584 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79b8d0fc-8e59-441b-ba94-6984ee28be95-scripts\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.811906 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/79b8d0fc-8e59-441b-ba94-6984ee28be95-dispersionconf\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.813590 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/79b8d0fc-8e59-441b-ba94-6984ee28be95-swiftconf\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.853847 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96l5b\" (UniqueName: \"kubernetes.io/projected/79b8d0fc-8e59-441b-ba94-6984ee28be95-kube-api-access-96l5b\") pod \"swift-ring-rebalance-debug-pwb7v\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:47 crc kubenswrapper[4738]: I0307 07:32:47.879767 4738 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:48 crc kubenswrapper[4738]: I0307 07:32:48.168017 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v"] Mar 07 07:32:48 crc kubenswrapper[4738]: W0307 07:32:48.169358 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79b8d0fc_8e59_441b_ba94_6984ee28be95.slice/crio-abb4ee3a537e9e524592559fc613e66a306be068ebdd8154d7991083fb8190bd WatchSource:0}: Error finding container abb4ee3a537e9e524592559fc613e66a306be068ebdd8154d7991083fb8190bd: Status 404 returned error can't find the container with id abb4ee3a537e9e524592559fc613e66a306be068ebdd8154d7991083fb8190bd Mar 07 07:32:48 crc kubenswrapper[4738]: I0307 07:32:48.394392 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81b8bf9-2fc3-4679-bcc1-978997372f96" path="/var/lib/kubelet/pods/e81b8bf9-2fc3-4679-bcc1-978997372f96/volumes" Mar 07 07:32:49 crc kubenswrapper[4738]: I0307 07:32:49.058572 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" event={"ID":"79b8d0fc-8e59-441b-ba94-6984ee28be95","Type":"ContainerStarted","Data":"e2a40bf195c6b9012453b43ad2e4fbe24b063ee0c7b348081e0cc5a07d7e7328"} Mar 07 07:32:49 crc kubenswrapper[4738]: I0307 07:32:49.058913 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" event={"ID":"79b8d0fc-8e59-441b-ba94-6984ee28be95","Type":"ContainerStarted","Data":"abb4ee3a537e9e524592559fc613e66a306be068ebdd8154d7991083fb8190bd"} Mar 07 07:32:49 crc kubenswrapper[4738]: I0307 07:32:49.078219 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" podStartSLOduration=2.078203483 
podStartE2EDuration="2.078203483s" podCreationTimestamp="2026-03-07 07:32:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:32:49.075227632 +0000 UTC m=+1987.540214953" watchObservedRunningTime="2026-03-07 07:32:49.078203483 +0000 UTC m=+1987.543190804" Mar 07 07:32:51 crc kubenswrapper[4738]: I0307 07:32:51.075675 4738 generic.go:334] "Generic (PLEG): container finished" podID="79b8d0fc-8e59-441b-ba94-6984ee28be95" containerID="e2a40bf195c6b9012453b43ad2e4fbe24b063ee0c7b348081e0cc5a07d7e7328" exitCode=0 Mar 07 07:32:51 crc kubenswrapper[4738]: I0307 07:32:51.075791 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" event={"ID":"79b8d0fc-8e59-441b-ba94-6984ee28be95","Type":"ContainerDied","Data":"e2a40bf195c6b9012453b43ad2e4fbe24b063ee0c7b348081e0cc5a07d7e7328"} Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.360310 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.483789 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79b8d0fc-8e59-441b-ba94-6984ee28be95-scripts\") pod \"79b8d0fc-8e59-441b-ba94-6984ee28be95\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.483919 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/79b8d0fc-8e59-441b-ba94-6984ee28be95-etc-swift\") pod \"79b8d0fc-8e59-441b-ba94-6984ee28be95\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.483941 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/79b8d0fc-8e59-441b-ba94-6984ee28be95-swiftconf\") pod \"79b8d0fc-8e59-441b-ba94-6984ee28be95\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.483968 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/79b8d0fc-8e59-441b-ba94-6984ee28be95-ring-data-devices\") pod \"79b8d0fc-8e59-441b-ba94-6984ee28be95\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.484005 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96l5b\" (UniqueName: \"kubernetes.io/projected/79b8d0fc-8e59-441b-ba94-6984ee28be95-kube-api-access-96l5b\") pod \"79b8d0fc-8e59-441b-ba94-6984ee28be95\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.484025 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/79b8d0fc-8e59-441b-ba94-6984ee28be95-dispersionconf\") pod \"79b8d0fc-8e59-441b-ba94-6984ee28be95\" (UID: \"79b8d0fc-8e59-441b-ba94-6984ee28be95\") " Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.484724 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79b8d0fc-8e59-441b-ba94-6984ee28be95-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "79b8d0fc-8e59-441b-ba94-6984ee28be95" (UID: "79b8d0fc-8e59-441b-ba94-6984ee28be95"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.484791 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79b8d0fc-8e59-441b-ba94-6984ee28be95-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "79b8d0fc-8e59-441b-ba94-6984ee28be95" (UID: "79b8d0fc-8e59-441b-ba94-6984ee28be95"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.488706 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79b8d0fc-8e59-441b-ba94-6984ee28be95-kube-api-access-96l5b" (OuterVolumeSpecName: "kube-api-access-96l5b") pod "79b8d0fc-8e59-441b-ba94-6984ee28be95" (UID: "79b8d0fc-8e59-441b-ba94-6984ee28be95"). InnerVolumeSpecName "kube-api-access-96l5b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.500701 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/79b8d0fc-8e59-441b-ba94-6984ee28be95-scripts" (OuterVolumeSpecName: "scripts") pod "79b8d0fc-8e59-441b-ba94-6984ee28be95" (UID: "79b8d0fc-8e59-441b-ba94-6984ee28be95"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.516498 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79b8d0fc-8e59-441b-ba94-6984ee28be95-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "79b8d0fc-8e59-441b-ba94-6984ee28be95" (UID: "79b8d0fc-8e59-441b-ba94-6984ee28be95"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.519403 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79b8d0fc-8e59-441b-ba94-6984ee28be95-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "79b8d0fc-8e59-441b-ba94-6984ee28be95" (UID: "79b8d0fc-8e59-441b-ba94-6984ee28be95"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.572388 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v"] Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.577719 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v"] Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.585707 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/79b8d0fc-8e59-441b-ba94-6984ee28be95-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.585727 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/79b8d0fc-8e59-441b-ba94-6984ee28be95-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.585757 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/79b8d0fc-8e59-441b-ba94-6984ee28be95-swiftconf\") 
on node \"crc\" DevicePath \"\"" Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.585766 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/79b8d0fc-8e59-441b-ba94-6984ee28be95-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.585775 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96l5b\" (UniqueName: \"kubernetes.io/projected/79b8d0fc-8e59-441b-ba94-6984ee28be95-kube-api-access-96l5b\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:52 crc kubenswrapper[4738]: I0307 07:32:52.585784 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/79b8d0fc-8e59-441b-ba94-6984ee28be95-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:53 crc kubenswrapper[4738]: I0307 07:32:53.093077 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abb4ee3a537e9e524592559fc613e66a306be068ebdd8154d7991083fb8190bd" Mar 07 07:32:53 crc kubenswrapper[4738]: I0307 07:32:53.093531 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pwb7v" Mar 07 07:32:53 crc kubenswrapper[4738]: I0307 07:32:53.778736 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8"] Mar 07 07:32:53 crc kubenswrapper[4738]: E0307 07:32:53.779414 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79b8d0fc-8e59-441b-ba94-6984ee28be95" containerName="swift-ring-rebalance" Mar 07 07:32:53 crc kubenswrapper[4738]: I0307 07:32:53.779430 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="79b8d0fc-8e59-441b-ba94-6984ee28be95" containerName="swift-ring-rebalance" Mar 07 07:32:53 crc kubenswrapper[4738]: I0307 07:32:53.779609 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="79b8d0fc-8e59-441b-ba94-6984ee28be95" containerName="swift-ring-rebalance" Mar 07 07:32:53 crc kubenswrapper[4738]: I0307 07:32:53.780237 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:53 crc kubenswrapper[4738]: I0307 07:32:53.784971 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:32:53 crc kubenswrapper[4738]: I0307 07:32:53.786349 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:32:53 crc kubenswrapper[4738]: I0307 07:32:53.806211 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8"] Mar 07 07:32:53 crc kubenswrapper[4738]: I0307 07:32:53.905445 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0b764a92-4693-41e8-9b3b-5e4aad70a28e-swiftconf\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:53 crc kubenswrapper[4738]: I0307 07:32:53.905501 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b764a92-4693-41e8-9b3b-5e4aad70a28e-scripts\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:53 crc kubenswrapper[4738]: I0307 07:32:53.905523 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfwcg\" (UniqueName: \"kubernetes.io/projected/0b764a92-4693-41e8-9b3b-5e4aad70a28e-kube-api-access-gfwcg\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:53 crc kubenswrapper[4738]: I0307 07:32:53.905550 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0b764a92-4693-41e8-9b3b-5e4aad70a28e-ring-data-devices\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:53 crc kubenswrapper[4738]: I0307 07:32:53.905797 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0b764a92-4693-41e8-9b3b-5e4aad70a28e-etc-swift\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:53 crc kubenswrapper[4738]: I0307 07:32:53.905836 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/0b764a92-4693-41e8-9b3b-5e4aad70a28e-dispersionconf\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:54 crc kubenswrapper[4738]: I0307 07:32:54.007226 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0b764a92-4693-41e8-9b3b-5e4aad70a28e-dispersionconf\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:54 crc kubenswrapper[4738]: I0307 07:32:54.007276 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0b764a92-4693-41e8-9b3b-5e4aad70a28e-etc-swift\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:54 crc kubenswrapper[4738]: I0307 07:32:54.007336 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0b764a92-4693-41e8-9b3b-5e4aad70a28e-swiftconf\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:54 crc kubenswrapper[4738]: I0307 07:32:54.007371 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b764a92-4693-41e8-9b3b-5e4aad70a28e-scripts\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:54 crc kubenswrapper[4738]: I0307 07:32:54.007397 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfwcg\" (UniqueName: 
\"kubernetes.io/projected/0b764a92-4693-41e8-9b3b-5e4aad70a28e-kube-api-access-gfwcg\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:54 crc kubenswrapper[4738]: I0307 07:32:54.007423 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0b764a92-4693-41e8-9b3b-5e4aad70a28e-ring-data-devices\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:54 crc kubenswrapper[4738]: I0307 07:32:54.008233 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0b764a92-4693-41e8-9b3b-5e4aad70a28e-etc-swift\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:54 crc kubenswrapper[4738]: I0307 07:32:54.008544 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0b764a92-4693-41e8-9b3b-5e4aad70a28e-ring-data-devices\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:54 crc kubenswrapper[4738]: I0307 07:32:54.008712 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b764a92-4693-41e8-9b3b-5e4aad70a28e-scripts\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:54 crc kubenswrapper[4738]: I0307 07:32:54.012635 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/0b764a92-4693-41e8-9b3b-5e4aad70a28e-dispersionconf\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:54 crc kubenswrapper[4738]: I0307 07:32:54.013599 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0b764a92-4693-41e8-9b3b-5e4aad70a28e-swiftconf\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:54 crc kubenswrapper[4738]: I0307 07:32:54.030821 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfwcg\" (UniqueName: \"kubernetes.io/projected/0b764a92-4693-41e8-9b3b-5e4aad70a28e-kube-api-access-gfwcg\") pod \"swift-ring-rebalance-debug-cv2c8\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:54 crc kubenswrapper[4738]: I0307 07:32:54.098481 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:54 crc kubenswrapper[4738]: I0307 07:32:54.381484 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8"] Mar 07 07:32:54 crc kubenswrapper[4738]: I0307 07:32:54.399803 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79b8d0fc-8e59-441b-ba94-6984ee28be95" path="/var/lib/kubelet/pods/79b8d0fc-8e59-441b-ba94-6984ee28be95/volumes" Mar 07 07:32:55 crc kubenswrapper[4738]: I0307 07:32:55.108953 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" event={"ID":"0b764a92-4693-41e8-9b3b-5e4aad70a28e","Type":"ContainerStarted","Data":"be04b1a066ae7dd9303878b6bd39ac5173216d92b1ec98b92dbd4723b96918df"} Mar 07 07:32:55 crc kubenswrapper[4738]: I0307 07:32:55.109461 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" event={"ID":"0b764a92-4693-41e8-9b3b-5e4aad70a28e","Type":"ContainerStarted","Data":"8fba2ad269097637f16b1afcc66f2f45bca2acfa7c3f0bde791f50bbc6bb54be"} Mar 07 07:32:55 crc kubenswrapper[4738]: I0307 07:32:55.139914 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" podStartSLOduration=2.139894446 podStartE2EDuration="2.139894446s" podCreationTimestamp="2026-03-07 07:32:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:32:55.128184468 +0000 UTC m=+1993.593171799" watchObservedRunningTime="2026-03-07 07:32:55.139894446 +0000 UTC m=+1993.604881767" Mar 07 07:32:56 crc kubenswrapper[4738]: I0307 07:32:56.117450 4738 generic.go:334] "Generic (PLEG): container finished" podID="0b764a92-4693-41e8-9b3b-5e4aad70a28e" containerID="be04b1a066ae7dd9303878b6bd39ac5173216d92b1ec98b92dbd4723b96918df" exitCode=0 
Mar 07 07:32:56 crc kubenswrapper[4738]: I0307 07:32:56.117498 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" event={"ID":"0b764a92-4693-41e8-9b3b-5e4aad70a28e","Type":"ContainerDied","Data":"be04b1a066ae7dd9303878b6bd39ac5173216d92b1ec98b92dbd4723b96918df"} Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.386543 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:32:57 crc kubenswrapper[4738]: E0307 07:32:57.387147 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.510957 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.558016 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8"] Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.559601 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b764a92-4693-41e8-9b3b-5e4aad70a28e-scripts\") pod \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.559682 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0b764a92-4693-41e8-9b3b-5e4aad70a28e-ring-data-devices\") pod \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.559707 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0b764a92-4693-41e8-9b3b-5e4aad70a28e-dispersionconf\") pod \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.559794 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfwcg\" (UniqueName: \"kubernetes.io/projected/0b764a92-4693-41e8-9b3b-5e4aad70a28e-kube-api-access-gfwcg\") pod \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.559833 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0b764a92-4693-41e8-9b3b-5e4aad70a28e-swiftconf\") pod \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\" (UID: 
\"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.559851 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0b764a92-4693-41e8-9b3b-5e4aad70a28e-etc-swift\") pod \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\" (UID: \"0b764a92-4693-41e8-9b3b-5e4aad70a28e\") " Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.560843 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b764a92-4693-41e8-9b3b-5e4aad70a28e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0b764a92-4693-41e8-9b3b-5e4aad70a28e" (UID: "0b764a92-4693-41e8-9b3b-5e4aad70a28e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.564247 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8"] Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.566414 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b764a92-4693-41e8-9b3b-5e4aad70a28e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0b764a92-4693-41e8-9b3b-5e4aad70a28e" (UID: "0b764a92-4693-41e8-9b3b-5e4aad70a28e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.574596 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b764a92-4693-41e8-9b3b-5e4aad70a28e-kube-api-access-gfwcg" (OuterVolumeSpecName: "kube-api-access-gfwcg") pod "0b764a92-4693-41e8-9b3b-5e4aad70a28e" (UID: "0b764a92-4693-41e8-9b3b-5e4aad70a28e"). InnerVolumeSpecName "kube-api-access-gfwcg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.596146 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b764a92-4693-41e8-9b3b-5e4aad70a28e-scripts" (OuterVolumeSpecName: "scripts") pod "0b764a92-4693-41e8-9b3b-5e4aad70a28e" (UID: "0b764a92-4693-41e8-9b3b-5e4aad70a28e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.606072 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b764a92-4693-41e8-9b3b-5e4aad70a28e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0b764a92-4693-41e8-9b3b-5e4aad70a28e" (UID: "0b764a92-4693-41e8-9b3b-5e4aad70a28e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.611091 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b764a92-4693-41e8-9b3b-5e4aad70a28e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0b764a92-4693-41e8-9b3b-5e4aad70a28e" (UID: "0b764a92-4693-41e8-9b3b-5e4aad70a28e"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.662361 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0b764a92-4693-41e8-9b3b-5e4aad70a28e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.662418 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0b764a92-4693-41e8-9b3b-5e4aad70a28e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.662439 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfwcg\" (UniqueName: \"kubernetes.io/projected/0b764a92-4693-41e8-9b3b-5e4aad70a28e-kube-api-access-gfwcg\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.662464 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0b764a92-4693-41e8-9b3b-5e4aad70a28e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.662483 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0b764a92-4693-41e8-9b3b-5e4aad70a28e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:57 crc kubenswrapper[4738]: I0307 07:32:57.662500 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b764a92-4693-41e8-9b3b-5e4aad70a28e-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.142906 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fba2ad269097637f16b1afcc66f2f45bca2acfa7c3f0bde791f50bbc6bb54be" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.143017 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-cv2c8" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.433246 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b764a92-4693-41e8-9b3b-5e4aad70a28e" path="/var/lib/kubelet/pods/0b764a92-4693-41e8-9b3b-5e4aad70a28e/volumes" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.703669 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6"] Mar 07 07:32:58 crc kubenswrapper[4738]: E0307 07:32:58.704043 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b764a92-4693-41e8-9b3b-5e4aad70a28e" containerName="swift-ring-rebalance" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.704059 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b764a92-4693-41e8-9b3b-5e4aad70a28e" containerName="swift-ring-rebalance" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.704274 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b764a92-4693-41e8-9b3b-5e4aad70a28e" containerName="swift-ring-rebalance" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.704844 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.711796 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.712223 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.722881 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6"] Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.781285 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/05984065-cdfc-4bcf-9445-1336bc28ee8c-ring-data-devices\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.781398 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/05984065-cdfc-4bcf-9445-1336bc28ee8c-dispersionconf\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.781504 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/05984065-cdfc-4bcf-9445-1336bc28ee8c-etc-swift\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.781583 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjj69\" (UniqueName: \"kubernetes.io/projected/05984065-cdfc-4bcf-9445-1336bc28ee8c-kube-api-access-cjj69\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.781648 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05984065-cdfc-4bcf-9445-1336bc28ee8c-scripts\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.781692 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/05984065-cdfc-4bcf-9445-1336bc28ee8c-swiftconf\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.883759 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/05984065-cdfc-4bcf-9445-1336bc28ee8c-ring-data-devices\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.883910 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/05984065-cdfc-4bcf-9445-1336bc28ee8c-dispersionconf\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc 
kubenswrapper[4738]: I0307 07:32:58.884037 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/05984065-cdfc-4bcf-9445-1336bc28ee8c-etc-swift\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.884135 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjj69\" (UniqueName: \"kubernetes.io/projected/05984065-cdfc-4bcf-9445-1336bc28ee8c-kube-api-access-cjj69\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.884281 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05984065-cdfc-4bcf-9445-1336bc28ee8c-scripts\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.884359 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/05984065-cdfc-4bcf-9445-1336bc28ee8c-swiftconf\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.884866 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/05984065-cdfc-4bcf-9445-1336bc28ee8c-ring-data-devices\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc 
kubenswrapper[4738]: I0307 07:32:58.885058 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/05984065-cdfc-4bcf-9445-1336bc28ee8c-etc-swift\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.885608 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05984065-cdfc-4bcf-9445-1336bc28ee8c-scripts\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.890320 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/05984065-cdfc-4bcf-9445-1336bc28ee8c-dispersionconf\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.890805 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/05984065-cdfc-4bcf-9445-1336bc28ee8c-swiftconf\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:58 crc kubenswrapper[4738]: I0307 07:32:58.920297 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjj69\" (UniqueName: \"kubernetes.io/projected/05984065-cdfc-4bcf-9445-1336bc28ee8c-kube-api-access-cjj69\") pod \"swift-ring-rebalance-debug-f7dm6\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:59 crc kubenswrapper[4738]: I0307 
07:32:59.031546 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:32:59 crc kubenswrapper[4738]: I0307 07:32:59.529063 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6"] Mar 07 07:33:00 crc kubenswrapper[4738]: I0307 07:33:00.159767 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" event={"ID":"05984065-cdfc-4bcf-9445-1336bc28ee8c","Type":"ContainerStarted","Data":"eb4b7b2b8467332219a46116917aa28deeeb097854357965abee4ac3b59d453b"} Mar 07 07:33:00 crc kubenswrapper[4738]: I0307 07:33:00.160061 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" event={"ID":"05984065-cdfc-4bcf-9445-1336bc28ee8c","Type":"ContainerStarted","Data":"e30bcec49e25e06763c563dc017588c39db1a857ead2c92fff584121e74647fe"} Mar 07 07:33:00 crc kubenswrapper[4738]: I0307 07:33:00.196660 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" podStartSLOduration=2.196637983 podStartE2EDuration="2.196637983s" podCreationTimestamp="2026-03-07 07:32:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:33:00.185257645 +0000 UTC m=+1998.650245006" watchObservedRunningTime="2026-03-07 07:33:00.196637983 +0000 UTC m=+1998.661625324" Mar 07 07:33:02 crc kubenswrapper[4738]: I0307 07:33:02.178109 4738 generic.go:334] "Generic (PLEG): container finished" podID="05984065-cdfc-4bcf-9445-1336bc28ee8c" containerID="eb4b7b2b8467332219a46116917aa28deeeb097854357965abee4ac3b59d453b" exitCode=0 Mar 07 07:33:02 crc kubenswrapper[4738]: I0307 07:33:02.178177 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" 
event={"ID":"05984065-cdfc-4bcf-9445-1336bc28ee8c","Type":"ContainerDied","Data":"eb4b7b2b8467332219a46116917aa28deeeb097854357965abee4ac3b59d453b"} Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.469047 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.500855 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6"] Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.507840 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6"] Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.554681 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/05984065-cdfc-4bcf-9445-1336bc28ee8c-swiftconf\") pod \"05984065-cdfc-4bcf-9445-1336bc28ee8c\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.554819 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/05984065-cdfc-4bcf-9445-1336bc28ee8c-etc-swift\") pod \"05984065-cdfc-4bcf-9445-1336bc28ee8c\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.554854 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/05984065-cdfc-4bcf-9445-1336bc28ee8c-dispersionconf\") pod \"05984065-cdfc-4bcf-9445-1336bc28ee8c\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.554984 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/05984065-cdfc-4bcf-9445-1336bc28ee8c-ring-data-devices\") pod \"05984065-cdfc-4bcf-9445-1336bc28ee8c\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.555605 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjj69\" (UniqueName: \"kubernetes.io/projected/05984065-cdfc-4bcf-9445-1336bc28ee8c-kube-api-access-cjj69\") pod \"05984065-cdfc-4bcf-9445-1336bc28ee8c\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.555634 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05984065-cdfc-4bcf-9445-1336bc28ee8c-scripts\") pod \"05984065-cdfc-4bcf-9445-1336bc28ee8c\" (UID: \"05984065-cdfc-4bcf-9445-1336bc28ee8c\") " Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.556140 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05984065-cdfc-4bcf-9445-1336bc28ee8c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "05984065-cdfc-4bcf-9445-1336bc28ee8c" (UID: "05984065-cdfc-4bcf-9445-1336bc28ee8c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.556400 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05984065-cdfc-4bcf-9445-1336bc28ee8c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "05984065-cdfc-4bcf-9445-1336bc28ee8c" (UID: "05984065-cdfc-4bcf-9445-1336bc28ee8c"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.561063 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05984065-cdfc-4bcf-9445-1336bc28ee8c-kube-api-access-cjj69" (OuterVolumeSpecName: "kube-api-access-cjj69") pod "05984065-cdfc-4bcf-9445-1336bc28ee8c" (UID: "05984065-cdfc-4bcf-9445-1336bc28ee8c"). InnerVolumeSpecName "kube-api-access-cjj69". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.578408 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05984065-cdfc-4bcf-9445-1336bc28ee8c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "05984065-cdfc-4bcf-9445-1336bc28ee8c" (UID: "05984065-cdfc-4bcf-9445-1336bc28ee8c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.579202 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05984065-cdfc-4bcf-9445-1336bc28ee8c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "05984065-cdfc-4bcf-9445-1336bc28ee8c" (UID: "05984065-cdfc-4bcf-9445-1336bc28ee8c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.585310 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05984065-cdfc-4bcf-9445-1336bc28ee8c-scripts" (OuterVolumeSpecName: "scripts") pod "05984065-cdfc-4bcf-9445-1336bc28ee8c" (UID: "05984065-cdfc-4bcf-9445-1336bc28ee8c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.657366 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/05984065-cdfc-4bcf-9445-1336bc28ee8c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.657413 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/05984065-cdfc-4bcf-9445-1336bc28ee8c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.657424 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/05984065-cdfc-4bcf-9445-1336bc28ee8c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.657438 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjj69\" (UniqueName: \"kubernetes.io/projected/05984065-cdfc-4bcf-9445-1336bc28ee8c-kube-api-access-cjj69\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.657452 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05984065-cdfc-4bcf-9445-1336bc28ee8c-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:03 crc kubenswrapper[4738]: I0307 07:33:03.657463 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/05984065-cdfc-4bcf-9445-1336bc28ee8c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.203350 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e30bcec49e25e06763c563dc017588c39db1a857ead2c92fff584121e74647fe" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.203423 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f7dm6" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.446842 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05984065-cdfc-4bcf-9445-1336bc28ee8c" path="/var/lib/kubelet/pods/05984065-cdfc-4bcf-9445-1336bc28ee8c/volumes" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.645225 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x"] Mar 07 07:33:04 crc kubenswrapper[4738]: E0307 07:33:04.645692 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05984065-cdfc-4bcf-9445-1336bc28ee8c" containerName="swift-ring-rebalance" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.645713 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="05984065-cdfc-4bcf-9445-1336bc28ee8c" containerName="swift-ring-rebalance" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.646067 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="05984065-cdfc-4bcf-9445-1336bc28ee8c" containerName="swift-ring-rebalance" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.647086 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.651316 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.653253 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x"] Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.654621 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.774506 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-swiftconf\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.774791 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-etc-swift\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.774922 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-dispersionconf\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.775042 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-ring-data-devices\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.775170 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmgq8\" (UniqueName: \"kubernetes.io/projected/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-kube-api-access-kmgq8\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.775285 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-scripts\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.876408 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-scripts\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.876497 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-swiftconf\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.876528 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-etc-swift\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.876560 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-dispersionconf\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.876610 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-ring-data-devices\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.876647 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmgq8\" (UniqueName: \"kubernetes.io/projected/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-kube-api-access-kmgq8\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.877721 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-scripts\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 
07:33:04.878733 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-ring-data-devices\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.879253 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-etc-swift\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.881649 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-swiftconf\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.886189 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-dispersionconf\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.896344 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmgq8\" (UniqueName: \"kubernetes.io/projected/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-kube-api-access-kmgq8\") pod \"swift-ring-rebalance-debug-2xc4x\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:04 crc kubenswrapper[4738]: I0307 07:33:04.973275 4738 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:05 crc kubenswrapper[4738]: I0307 07:33:05.276081 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x"] Mar 07 07:33:05 crc kubenswrapper[4738]: W0307 07:33:05.286067 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48e427c6_10cc_4d0e_ab4c_8e9e14fe6280.slice/crio-d17238ad8afd4be63216d745d8a5512a76fcc83250d1fb2039eb333d1937b322 WatchSource:0}: Error finding container d17238ad8afd4be63216d745d8a5512a76fcc83250d1fb2039eb333d1937b322: Status 404 returned error can't find the container with id d17238ad8afd4be63216d745d8a5512a76fcc83250d1fb2039eb333d1937b322 Mar 07 07:33:06 crc kubenswrapper[4738]: I0307 07:33:06.229359 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" event={"ID":"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280","Type":"ContainerStarted","Data":"b636b2bad93e52f58cdd67493996fa208c34e09f9a7b44d047365049ef19860b"} Mar 07 07:33:06 crc kubenswrapper[4738]: I0307 07:33:06.229694 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" event={"ID":"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280","Type":"ContainerStarted","Data":"d17238ad8afd4be63216d745d8a5512a76fcc83250d1fb2039eb333d1937b322"} Mar 07 07:33:07 crc kubenswrapper[4738]: I0307 07:33:07.242932 4738 generic.go:334] "Generic (PLEG): container finished" podID="48e427c6-10cc-4d0e-ab4c-8e9e14fe6280" containerID="b636b2bad93e52f58cdd67493996fa208c34e09f9a7b44d047365049ef19860b" exitCode=0 Mar 07 07:33:07 crc kubenswrapper[4738]: I0307 07:33:07.243355 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" 
event={"ID":"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280","Type":"ContainerDied","Data":"b636b2bad93e52f58cdd67493996fa208c34e09f9a7b44d047365049ef19860b"} Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.595617 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.661319 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x"] Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.669321 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x"] Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.750567 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-scripts\") pod \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.751137 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-ring-data-devices\") pod \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.751231 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-etc-swift\") pod \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.751318 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmgq8\" (UniqueName: 
\"kubernetes.io/projected/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-kube-api-access-kmgq8\") pod \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.751354 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-swiftconf\") pod \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.751415 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-dispersionconf\") pod \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\" (UID: \"48e427c6-10cc-4d0e-ab4c-8e9e14fe6280\") " Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.751739 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "48e427c6-10cc-4d0e-ab4c-8e9e14fe6280" (UID: "48e427c6-10cc-4d0e-ab4c-8e9e14fe6280"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.752001 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.753221 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "48e427c6-10cc-4d0e-ab4c-8e9e14fe6280" (UID: "48e427c6-10cc-4d0e-ab4c-8e9e14fe6280"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.786371 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-kube-api-access-kmgq8" (OuterVolumeSpecName: "kube-api-access-kmgq8") pod "48e427c6-10cc-4d0e-ab4c-8e9e14fe6280" (UID: "48e427c6-10cc-4d0e-ab4c-8e9e14fe6280"). InnerVolumeSpecName "kube-api-access-kmgq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.800791 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-scripts" (OuterVolumeSpecName: "scripts") pod "48e427c6-10cc-4d0e-ab4c-8e9e14fe6280" (UID: "48e427c6-10cc-4d0e-ab4c-8e9e14fe6280"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.808317 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "48e427c6-10cc-4d0e-ab4c-8e9e14fe6280" (UID: "48e427c6-10cc-4d0e-ab4c-8e9e14fe6280"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.816315 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "48e427c6-10cc-4d0e-ab4c-8e9e14fe6280" (UID: "48e427c6-10cc-4d0e-ab4c-8e9e14fe6280"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.852938 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.852974 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.852987 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmgq8\" (UniqueName: \"kubernetes.io/projected/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-kube-api-access-kmgq8\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.852999 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:08 crc kubenswrapper[4738]: I0307 07:33:08.853010 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.261659 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d17238ad8afd4be63216d745d8a5512a76fcc83250d1fb2039eb333d1937b322" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.261910 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2xc4x" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.385776 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:33:09 crc kubenswrapper[4738]: E0307 07:33:09.386036 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.789883 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq"] Mar 07 07:33:09 crc kubenswrapper[4738]: E0307 07:33:09.790742 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e427c6-10cc-4d0e-ab4c-8e9e14fe6280" containerName="swift-ring-rebalance" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.790802 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e427c6-10cc-4d0e-ab4c-8e9e14fe6280" containerName="swift-ring-rebalance" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.791185 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="48e427c6-10cc-4d0e-ab4c-8e9e14fe6280" containerName="swift-ring-rebalance" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.792076 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.794291 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.794989 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.800610 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq"] Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.869246 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-swiftconf\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.869344 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-dispersionconf\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.869484 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-ring-data-devices\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.869574 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-etc-swift\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.869679 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmjdq\" (UniqueName: \"kubernetes.io/projected/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-kube-api-access-lmjdq\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.869749 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-scripts\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.971740 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-ring-data-devices\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.971813 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-etc-swift\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc 
kubenswrapper[4738]: I0307 07:33:09.971853 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmjdq\" (UniqueName: \"kubernetes.io/projected/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-kube-api-access-lmjdq\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.971882 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-scripts\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.971967 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-swiftconf\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.971998 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-dispersionconf\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.972822 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-ring-data-devices\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 
07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.973042 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-etc-swift\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.973413 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-scripts\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.977734 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-dispersionconf\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:09 crc kubenswrapper[4738]: I0307 07:33:09.980124 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-swiftconf\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:10 crc kubenswrapper[4738]: I0307 07:33:10.003550 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmjdq\" (UniqueName: \"kubernetes.io/projected/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-kube-api-access-lmjdq\") pod \"swift-ring-rebalance-debug-8n9vq\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:10 crc kubenswrapper[4738]: I0307 
07:33:10.152450 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:10 crc kubenswrapper[4738]: I0307 07:33:10.396391 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48e427c6-10cc-4d0e-ab4c-8e9e14fe6280" path="/var/lib/kubelet/pods/48e427c6-10cc-4d0e-ab4c-8e9e14fe6280/volumes" Mar 07 07:33:10 crc kubenswrapper[4738]: I0307 07:33:10.634350 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq"] Mar 07 07:33:11 crc kubenswrapper[4738]: I0307 07:33:11.286470 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" event={"ID":"ea493161-8ebf-4fb1-9fb2-b671f98eda1e","Type":"ContainerStarted","Data":"f98c9f011e414265be3c18216d73d08ce7cb52ab22eb8d26e6002da46eedb71a"} Mar 07 07:33:11 crc kubenswrapper[4738]: I0307 07:33:11.286835 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" event={"ID":"ea493161-8ebf-4fb1-9fb2-b671f98eda1e","Type":"ContainerStarted","Data":"900850f6f309261aa62ac035ce9412ff6aedd506625f9fdb6679c4d2c5ef0d8d"} Mar 07 07:33:11 crc kubenswrapper[4738]: I0307 07:33:11.319437 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" podStartSLOduration=2.31940987 podStartE2EDuration="2.31940987s" podCreationTimestamp="2026-03-07 07:33:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:33:11.313189891 +0000 UTC m=+2009.778177222" watchObservedRunningTime="2026-03-07 07:33:11.31940987 +0000 UTC m=+2009.784397231" Mar 07 07:33:13 crc kubenswrapper[4738]: I0307 07:33:13.311553 4738 generic.go:334] "Generic (PLEG): container finished" podID="ea493161-8ebf-4fb1-9fb2-b671f98eda1e" 
containerID="f98c9f011e414265be3c18216d73d08ce7cb52ab22eb8d26e6002da46eedb71a" exitCode=0 Mar 07 07:33:13 crc kubenswrapper[4738]: I0307 07:33:13.311654 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" event={"ID":"ea493161-8ebf-4fb1-9fb2-b671f98eda1e","Type":"ContainerDied","Data":"f98c9f011e414265be3c18216d73d08ce7cb52ab22eb8d26e6002da46eedb71a"} Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.578962 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.614666 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq"] Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.625271 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq"] Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.658348 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-ring-data-devices\") pod \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.658449 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-swiftconf\") pod \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.658500 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmjdq\" (UniqueName: \"kubernetes.io/projected/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-kube-api-access-lmjdq\") pod \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\" (UID: 
\"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.658557 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-dispersionconf\") pod \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.658572 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-scripts\") pod \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.658618 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-etc-swift\") pod \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\" (UID: \"ea493161-8ebf-4fb1-9fb2-b671f98eda1e\") " Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.658838 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ea493161-8ebf-4fb1-9fb2-b671f98eda1e" (UID: "ea493161-8ebf-4fb1-9fb2-b671f98eda1e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.660191 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ea493161-8ebf-4fb1-9fb2-b671f98eda1e" (UID: "ea493161-8ebf-4fb1-9fb2-b671f98eda1e"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.667459 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-kube-api-access-lmjdq" (OuterVolumeSpecName: "kube-api-access-lmjdq") pod "ea493161-8ebf-4fb1-9fb2-b671f98eda1e" (UID: "ea493161-8ebf-4fb1-9fb2-b671f98eda1e"). InnerVolumeSpecName "kube-api-access-lmjdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.683041 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ea493161-8ebf-4fb1-9fb2-b671f98eda1e" (UID: "ea493161-8ebf-4fb1-9fb2-b671f98eda1e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.683291 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ea493161-8ebf-4fb1-9fb2-b671f98eda1e" (UID: "ea493161-8ebf-4fb1-9fb2-b671f98eda1e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.686076 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-scripts" (OuterVolumeSpecName: "scripts") pod "ea493161-8ebf-4fb1-9fb2-b671f98eda1e" (UID: "ea493161-8ebf-4fb1-9fb2-b671f98eda1e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.760830 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmjdq\" (UniqueName: \"kubernetes.io/projected/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-kube-api-access-lmjdq\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.760862 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.760872 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.760884 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.760894 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:14 crc kubenswrapper[4738]: I0307 07:33:14.760902 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ea493161-8ebf-4fb1-9fb2-b671f98eda1e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.331752 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="900850f6f309261aa62ac035ce9412ff6aedd506625f9fdb6679c4d2c5ef0d8d" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.332246 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8n9vq" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.819055 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn"] Mar 07 07:33:15 crc kubenswrapper[4738]: E0307 07:33:15.819524 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea493161-8ebf-4fb1-9fb2-b671f98eda1e" containerName="swift-ring-rebalance" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.819547 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea493161-8ebf-4fb1-9fb2-b671f98eda1e" containerName="swift-ring-rebalance" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.819826 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea493161-8ebf-4fb1-9fb2-b671f98eda1e" containerName="swift-ring-rebalance" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.820787 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.838557 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.844944 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.847088 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn"] Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.877907 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4710ef57-0b6e-415e-b564-193d6c8489ae-scripts\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.877960 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4710ef57-0b6e-415e-b564-193d6c8489ae-dispersionconf\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.878007 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7lzw\" (UniqueName: \"kubernetes.io/projected/4710ef57-0b6e-415e-b564-193d6c8489ae-kube-api-access-k7lzw\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.878031 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4710ef57-0b6e-415e-b564-193d6c8489ae-swiftconf\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.878060 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4710ef57-0b6e-415e-b564-193d6c8489ae-etc-swift\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.878112 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/4710ef57-0b6e-415e-b564-193d6c8489ae-ring-data-devices\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.979081 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4710ef57-0b6e-415e-b564-193d6c8489ae-etc-swift\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.979187 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4710ef57-0b6e-415e-b564-193d6c8489ae-ring-data-devices\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.979256 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4710ef57-0b6e-415e-b564-193d6c8489ae-scripts\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.979272 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4710ef57-0b6e-415e-b564-193d6c8489ae-dispersionconf\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.979302 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-k7lzw\" (UniqueName: \"kubernetes.io/projected/4710ef57-0b6e-415e-b564-193d6c8489ae-kube-api-access-k7lzw\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.979321 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4710ef57-0b6e-415e-b564-193d6c8489ae-swiftconf\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.979776 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4710ef57-0b6e-415e-b564-193d6c8489ae-etc-swift\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.980883 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4710ef57-0b6e-415e-b564-193d6c8489ae-scripts\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.981224 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4710ef57-0b6e-415e-b564-193d6c8489ae-ring-data-devices\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.987929 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/4710ef57-0b6e-415e-b564-193d6c8489ae-dispersionconf\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:15 crc kubenswrapper[4738]: I0307 07:33:15.988324 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4710ef57-0b6e-415e-b564-193d6c8489ae-swiftconf\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:16 crc kubenswrapper[4738]: I0307 07:33:16.003013 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7lzw\" (UniqueName: \"kubernetes.io/projected/4710ef57-0b6e-415e-b564-193d6c8489ae-kube-api-access-k7lzw\") pod \"swift-ring-rebalance-debug-8cwzn\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:16 crc kubenswrapper[4738]: I0307 07:33:16.144300 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:16 crc kubenswrapper[4738]: I0307 07:33:16.394392 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea493161-8ebf-4fb1-9fb2-b671f98eda1e" path="/var/lib/kubelet/pods/ea493161-8ebf-4fb1-9fb2-b671f98eda1e/volumes" Mar 07 07:33:16 crc kubenswrapper[4738]: I0307 07:33:16.636629 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn"] Mar 07 07:33:17 crc kubenswrapper[4738]: I0307 07:33:17.347951 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" event={"ID":"4710ef57-0b6e-415e-b564-193d6c8489ae","Type":"ContainerStarted","Data":"c87ece0ce05d82a833aa45c5cd191768c2d0189a01b176dcf0074a82cca3bf50"} Mar 07 07:33:17 crc kubenswrapper[4738]: I0307 07:33:17.349667 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" event={"ID":"4710ef57-0b6e-415e-b564-193d6c8489ae","Type":"ContainerStarted","Data":"d5ba6956fbc21196f6cb0ebe51529741a5698e14ba939088367ed5359810bd2c"} Mar 07 07:33:17 crc kubenswrapper[4738]: I0307 07:33:17.383942 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" podStartSLOduration=2.38391541 podStartE2EDuration="2.38391541s" podCreationTimestamp="2026-03-07 07:33:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:33:17.373538648 +0000 UTC m=+2015.838525979" watchObservedRunningTime="2026-03-07 07:33:17.38391541 +0000 UTC m=+2015.848902741" Mar 07 07:33:18 crc kubenswrapper[4738]: I0307 07:33:18.360785 4738 generic.go:334] "Generic (PLEG): container finished" podID="4710ef57-0b6e-415e-b564-193d6c8489ae" containerID="c87ece0ce05d82a833aa45c5cd191768c2d0189a01b176dcf0074a82cca3bf50" exitCode=0 
Mar 07 07:33:18 crc kubenswrapper[4738]: I0307 07:33:18.360910 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" event={"ID":"4710ef57-0b6e-415e-b564-193d6c8489ae","Type":"ContainerDied","Data":"c87ece0ce05d82a833aa45c5cd191768c2d0189a01b176dcf0074a82cca3bf50"} Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.713791 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.745253 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn"] Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.750589 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn"] Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.835940 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4710ef57-0b6e-415e-b564-193d6c8489ae-dispersionconf\") pod \"4710ef57-0b6e-415e-b564-193d6c8489ae\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.836004 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4710ef57-0b6e-415e-b564-193d6c8489ae-ring-data-devices\") pod \"4710ef57-0b6e-415e-b564-193d6c8489ae\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.836089 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4710ef57-0b6e-415e-b564-193d6c8489ae-etc-swift\") pod \"4710ef57-0b6e-415e-b564-193d6c8489ae\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.836120 
4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4710ef57-0b6e-415e-b564-193d6c8489ae-scripts\") pod \"4710ef57-0b6e-415e-b564-193d6c8489ae\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.836199 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7lzw\" (UniqueName: \"kubernetes.io/projected/4710ef57-0b6e-415e-b564-193d6c8489ae-kube-api-access-k7lzw\") pod \"4710ef57-0b6e-415e-b564-193d6c8489ae\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.836246 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4710ef57-0b6e-415e-b564-193d6c8489ae-swiftconf\") pod \"4710ef57-0b6e-415e-b564-193d6c8489ae\" (UID: \"4710ef57-0b6e-415e-b564-193d6c8489ae\") " Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.836892 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4710ef57-0b6e-415e-b564-193d6c8489ae-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4710ef57-0b6e-415e-b564-193d6c8489ae" (UID: "4710ef57-0b6e-415e-b564-193d6c8489ae"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.837475 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4710ef57-0b6e-415e-b564-193d6c8489ae-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4710ef57-0b6e-415e-b564-193d6c8489ae" (UID: "4710ef57-0b6e-415e-b564-193d6c8489ae"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.842401 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4710ef57-0b6e-415e-b564-193d6c8489ae-kube-api-access-k7lzw" (OuterVolumeSpecName: "kube-api-access-k7lzw") pod "4710ef57-0b6e-415e-b564-193d6c8489ae" (UID: "4710ef57-0b6e-415e-b564-193d6c8489ae"). InnerVolumeSpecName "kube-api-access-k7lzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.857712 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4710ef57-0b6e-415e-b564-193d6c8489ae-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4710ef57-0b6e-415e-b564-193d6c8489ae" (UID: "4710ef57-0b6e-415e-b564-193d6c8489ae"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.880002 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4710ef57-0b6e-415e-b564-193d6c8489ae-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4710ef57-0b6e-415e-b564-193d6c8489ae" (UID: "4710ef57-0b6e-415e-b564-193d6c8489ae"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.881106 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4710ef57-0b6e-415e-b564-193d6c8489ae-scripts" (OuterVolumeSpecName: "scripts") pod "4710ef57-0b6e-415e-b564-193d6c8489ae" (UID: "4710ef57-0b6e-415e-b564-193d6c8489ae"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.938031 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4710ef57-0b6e-415e-b564-193d6c8489ae-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.938060 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4710ef57-0b6e-415e-b564-193d6c8489ae-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.938072 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7lzw\" (UniqueName: \"kubernetes.io/projected/4710ef57-0b6e-415e-b564-193d6c8489ae-kube-api-access-k7lzw\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.938083 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4710ef57-0b6e-415e-b564-193d6c8489ae-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.938091 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4710ef57-0b6e-415e-b564-193d6c8489ae-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:19 crc kubenswrapper[4738]: I0307 07:33:19.938099 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4710ef57-0b6e-415e-b564-193d6c8489ae-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:20 crc kubenswrapper[4738]: I0307 07:33:20.379543 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5ba6956fbc21196f6cb0ebe51529741a5698e14ba939088367ed5359810bd2c" Mar 07 07:33:20 crc kubenswrapper[4738]: I0307 07:33:20.379610 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8cwzn" Mar 07 07:33:20 crc kubenswrapper[4738]: I0307 07:33:20.385244 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:33:20 crc kubenswrapper[4738]: E0307 07:33:20.385538 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:33:20 crc kubenswrapper[4738]: I0307 07:33:20.394495 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4710ef57-0b6e-415e-b564-193d6c8489ae" path="/var/lib/kubelet/pods/4710ef57-0b6e-415e-b564-193d6c8489ae/volumes" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.301960 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn"] Mar 07 07:33:21 crc kubenswrapper[4738]: E0307 07:33:21.302666 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4710ef57-0b6e-415e-b564-193d6c8489ae" containerName="swift-ring-rebalance" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.302681 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="4710ef57-0b6e-415e-b564-193d6c8489ae" containerName="swift-ring-rebalance" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.302865 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="4710ef57-0b6e-415e-b564-193d6c8489ae" containerName="swift-ring-rebalance" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.303548 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.305483 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.305915 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.322664 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn"] Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.355928 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7afa681a-0d05-406d-91f8-e7d41ff17547-dispersionconf\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.356015 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7afa681a-0d05-406d-91f8-e7d41ff17547-ring-data-devices\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.356044 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7afa681a-0d05-406d-91f8-e7d41ff17547-etc-swift\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.356241 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rdbw\" (UniqueName: \"kubernetes.io/projected/7afa681a-0d05-406d-91f8-e7d41ff17547-kube-api-access-5rdbw\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.356319 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7afa681a-0d05-406d-91f8-e7d41ff17547-scripts\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.356390 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7afa681a-0d05-406d-91f8-e7d41ff17547-swiftconf\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.458294 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7afa681a-0d05-406d-91f8-e7d41ff17547-ring-data-devices\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.458372 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7afa681a-0d05-406d-91f8-e7d41ff17547-etc-swift\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc 
kubenswrapper[4738]: I0307 07:33:21.458499 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rdbw\" (UniqueName: \"kubernetes.io/projected/7afa681a-0d05-406d-91f8-e7d41ff17547-kube-api-access-5rdbw\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.458544 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7afa681a-0d05-406d-91f8-e7d41ff17547-scripts\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.458583 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7afa681a-0d05-406d-91f8-e7d41ff17547-swiftconf\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.458651 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7afa681a-0d05-406d-91f8-e7d41ff17547-dispersionconf\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.459781 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7afa681a-0d05-406d-91f8-e7d41ff17547-scripts\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc 
kubenswrapper[4738]: I0307 07:33:21.460255 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7afa681a-0d05-406d-91f8-e7d41ff17547-ring-data-devices\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.460324 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7afa681a-0d05-406d-91f8-e7d41ff17547-etc-swift\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.465992 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7afa681a-0d05-406d-91f8-e7d41ff17547-swiftconf\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.466037 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7afa681a-0d05-406d-91f8-e7d41ff17547-dispersionconf\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.487084 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rdbw\" (UniqueName: \"kubernetes.io/projected/7afa681a-0d05-406d-91f8-e7d41ff17547-kube-api-access-5rdbw\") pod \"swift-ring-rebalance-debug-vr2pn\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: 
I0307 07:33:21.636249 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:21 crc kubenswrapper[4738]: I0307 07:33:21.900416 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn"] Mar 07 07:33:22 crc kubenswrapper[4738]: I0307 07:33:22.404025 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" event={"ID":"7afa681a-0d05-406d-91f8-e7d41ff17547","Type":"ContainerStarted","Data":"a50bb74c16bc932859d2d2d1802bb162810bdf4adcd21ab01491aac26ca66b9f"} Mar 07 07:33:22 crc kubenswrapper[4738]: I0307 07:33:22.404088 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" event={"ID":"7afa681a-0d05-406d-91f8-e7d41ff17547","Type":"ContainerStarted","Data":"650568669c6ef7e4ef43422fdb19044735f121cbe3da331526741fd50002eb52"} Mar 07 07:33:22 crc kubenswrapper[4738]: I0307 07:33:22.447987 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" podStartSLOduration=1.447966487 podStartE2EDuration="1.447966487s" podCreationTimestamp="2026-03-07 07:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:33:22.444894484 +0000 UTC m=+2020.909881835" watchObservedRunningTime="2026-03-07 07:33:22.447966487 +0000 UTC m=+2020.912953828" Mar 07 07:33:23 crc kubenswrapper[4738]: I0307 07:33:23.419633 4738 generic.go:334] "Generic (PLEG): container finished" podID="7afa681a-0d05-406d-91f8-e7d41ff17547" containerID="a50bb74c16bc932859d2d2d1802bb162810bdf4adcd21ab01491aac26ca66b9f" exitCode=0 Mar 07 07:33:23 crc kubenswrapper[4738]: I0307 07:33:23.419732 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" 
event={"ID":"7afa681a-0d05-406d-91f8-e7d41ff17547","Type":"ContainerDied","Data":"a50bb74c16bc932859d2d2d1802bb162810bdf4adcd21ab01491aac26ca66b9f"} Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.769218 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.828630 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7afa681a-0d05-406d-91f8-e7d41ff17547-dispersionconf\") pod \"7afa681a-0d05-406d-91f8-e7d41ff17547\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.828679 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7afa681a-0d05-406d-91f8-e7d41ff17547-swiftconf\") pod \"7afa681a-0d05-406d-91f8-e7d41ff17547\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.828789 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7afa681a-0d05-406d-91f8-e7d41ff17547-scripts\") pod \"7afa681a-0d05-406d-91f8-e7d41ff17547\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.828816 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7afa681a-0d05-406d-91f8-e7d41ff17547-ring-data-devices\") pod \"7afa681a-0d05-406d-91f8-e7d41ff17547\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.828836 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rdbw\" (UniqueName: 
\"kubernetes.io/projected/7afa681a-0d05-406d-91f8-e7d41ff17547-kube-api-access-5rdbw\") pod \"7afa681a-0d05-406d-91f8-e7d41ff17547\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.828861 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7afa681a-0d05-406d-91f8-e7d41ff17547-etc-swift\") pod \"7afa681a-0d05-406d-91f8-e7d41ff17547\" (UID: \"7afa681a-0d05-406d-91f8-e7d41ff17547\") " Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.829810 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7afa681a-0d05-406d-91f8-e7d41ff17547-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7afa681a-0d05-406d-91f8-e7d41ff17547" (UID: "7afa681a-0d05-406d-91f8-e7d41ff17547"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.829868 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7afa681a-0d05-406d-91f8-e7d41ff17547-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7afa681a-0d05-406d-91f8-e7d41ff17547" (UID: "7afa681a-0d05-406d-91f8-e7d41ff17547"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.836111 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7afa681a-0d05-406d-91f8-e7d41ff17547-kube-api-access-5rdbw" (OuterVolumeSpecName: "kube-api-access-5rdbw") pod "7afa681a-0d05-406d-91f8-e7d41ff17547" (UID: "7afa681a-0d05-406d-91f8-e7d41ff17547"). InnerVolumeSpecName "kube-api-access-5rdbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.847432 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7afa681a-0d05-406d-91f8-e7d41ff17547-scripts" (OuterVolumeSpecName: "scripts") pod "7afa681a-0d05-406d-91f8-e7d41ff17547" (UID: "7afa681a-0d05-406d-91f8-e7d41ff17547"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.848338 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7afa681a-0d05-406d-91f8-e7d41ff17547-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7afa681a-0d05-406d-91f8-e7d41ff17547" (UID: "7afa681a-0d05-406d-91f8-e7d41ff17547"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.849284 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7afa681a-0d05-406d-91f8-e7d41ff17547-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7afa681a-0d05-406d-91f8-e7d41ff17547" (UID: "7afa681a-0d05-406d-91f8-e7d41ff17547"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.930703 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7afa681a-0d05-406d-91f8-e7d41ff17547-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.931037 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7afa681a-0d05-406d-91f8-e7d41ff17547-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.931136 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rdbw\" (UniqueName: \"kubernetes.io/projected/7afa681a-0d05-406d-91f8-e7d41ff17547-kube-api-access-5rdbw\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.931261 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7afa681a-0d05-406d-91f8-e7d41ff17547-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.931357 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7afa681a-0d05-406d-91f8-e7d41ff17547-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:24 crc kubenswrapper[4738]: I0307 07:33:24.931443 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7afa681a-0d05-406d-91f8-e7d41ff17547-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:25 crc kubenswrapper[4738]: I0307 07:33:25.442776 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" event={"ID":"7afa681a-0d05-406d-91f8-e7d41ff17547","Type":"ContainerDied","Data":"650568669c6ef7e4ef43422fdb19044735f121cbe3da331526741fd50002eb52"} Mar 07 07:33:25 crc 
kubenswrapper[4738]: I0307 07:33:25.442839 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="650568669c6ef7e4ef43422fdb19044735f121cbe3da331526741fd50002eb52" Mar 07 07:33:25 crc kubenswrapper[4738]: I0307 07:33:25.443369 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn" Mar 07 07:33:26 crc kubenswrapper[4738]: I0307 07:33:26.144651 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn"] Mar 07 07:33:26 crc kubenswrapper[4738]: I0307 07:33:26.150082 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vr2pn"] Mar 07 07:33:26 crc kubenswrapper[4738]: I0307 07:33:26.393844 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7afa681a-0d05-406d-91f8-e7d41ff17547" path="/var/lib/kubelet/pods/7afa681a-0d05-406d-91f8-e7d41ff17547/volumes" Mar 07 07:33:27 crc kubenswrapper[4738]: I0307 07:33:27.895312 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4"] Mar 07 07:33:27 crc kubenswrapper[4738]: E0307 07:33:27.895639 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7afa681a-0d05-406d-91f8-e7d41ff17547" containerName="swift-ring-rebalance" Mar 07 07:33:27 crc kubenswrapper[4738]: I0307 07:33:27.895658 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="7afa681a-0d05-406d-91f8-e7d41ff17547" containerName="swift-ring-rebalance" Mar 07 07:33:27 crc kubenswrapper[4738]: I0307 07:33:27.895907 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="7afa681a-0d05-406d-91f8-e7d41ff17547" containerName="swift-ring-rebalance" Mar 07 07:33:27 crc kubenswrapper[4738]: I0307 07:33:27.896496 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:27 crc kubenswrapper[4738]: I0307 07:33:27.901908 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:33:27 crc kubenswrapper[4738]: I0307 07:33:27.903762 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:33:27 crc kubenswrapper[4738]: I0307 07:33:27.914639 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4"] Mar 07 07:33:27 crc kubenswrapper[4738]: I0307 07:33:27.979264 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c1f1c711-8640-47e2-801e-fef3feefe876-dispersionconf\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:27 crc kubenswrapper[4738]: I0307 07:33:27.979626 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btpdx\" (UniqueName: \"kubernetes.io/projected/c1f1c711-8640-47e2-801e-fef3feefe876-kube-api-access-btpdx\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:27 crc kubenswrapper[4738]: I0307 07:33:27.979834 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c1f1c711-8640-47e2-801e-fef3feefe876-etc-swift\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:27 crc kubenswrapper[4738]: I0307 07:33:27.980043 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1f1c711-8640-47e2-801e-fef3feefe876-scripts\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:27 crc kubenswrapper[4738]: I0307 07:33:27.980302 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c1f1c711-8640-47e2-801e-fef3feefe876-swiftconf\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:27 crc kubenswrapper[4738]: I0307 07:33:27.980551 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c1f1c711-8640-47e2-801e-fef3feefe876-ring-data-devices\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:28 crc kubenswrapper[4738]: I0307 07:33:28.082591 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c1f1c711-8640-47e2-801e-fef3feefe876-dispersionconf\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:28 crc kubenswrapper[4738]: I0307 07:33:28.082680 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btpdx\" (UniqueName: \"kubernetes.io/projected/c1f1c711-8640-47e2-801e-fef3feefe876-kube-api-access-btpdx\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:28 crc 
kubenswrapper[4738]: I0307 07:33:28.082727 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c1f1c711-8640-47e2-801e-fef3feefe876-etc-swift\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:28 crc kubenswrapper[4738]: I0307 07:33:28.082826 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1f1c711-8640-47e2-801e-fef3feefe876-scripts\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:28 crc kubenswrapper[4738]: I0307 07:33:28.082879 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c1f1c711-8640-47e2-801e-fef3feefe876-swiftconf\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:28 crc kubenswrapper[4738]: I0307 07:33:28.082930 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c1f1c711-8640-47e2-801e-fef3feefe876-ring-data-devices\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:28 crc kubenswrapper[4738]: I0307 07:33:28.083784 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c1f1c711-8640-47e2-801e-fef3feefe876-etc-swift\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:28 crc kubenswrapper[4738]: 
I0307 07:33:28.083984 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c1f1c711-8640-47e2-801e-fef3feefe876-ring-data-devices\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:28 crc kubenswrapper[4738]: I0307 07:33:28.084033 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1f1c711-8640-47e2-801e-fef3feefe876-scripts\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:28 crc kubenswrapper[4738]: I0307 07:33:28.089382 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c1f1c711-8640-47e2-801e-fef3feefe876-dispersionconf\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:28 crc kubenswrapper[4738]: I0307 07:33:28.090737 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c1f1c711-8640-47e2-801e-fef3feefe876-swiftconf\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:28 crc kubenswrapper[4738]: I0307 07:33:28.112290 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btpdx\" (UniqueName: \"kubernetes.io/projected/c1f1c711-8640-47e2-801e-fef3feefe876-kube-api-access-btpdx\") pod \"swift-ring-rebalance-debug-mgsc4\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:28 crc kubenswrapper[4738]: I0307 07:33:28.216525 
4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:28 crc kubenswrapper[4738]: I0307 07:33:28.638122 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4"] Mar 07 07:33:29 crc kubenswrapper[4738]: I0307 07:33:29.478880 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" event={"ID":"c1f1c711-8640-47e2-801e-fef3feefe876","Type":"ContainerStarted","Data":"38cc97c2efe9f05739b9bb2e574eb5529cb8cf81ce238710267c19ad649cf350"} Mar 07 07:33:29 crc kubenswrapper[4738]: I0307 07:33:29.479303 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" event={"ID":"c1f1c711-8640-47e2-801e-fef3feefe876","Type":"ContainerStarted","Data":"051bacfae2453c8e1537afee62afe4b30e4d48b3f32c848a72cf3dcaac3f6a67"} Mar 07 07:33:30 crc kubenswrapper[4738]: I0307 07:33:30.492247 4738 generic.go:334] "Generic (PLEG): container finished" podID="c1f1c711-8640-47e2-801e-fef3feefe876" containerID="38cc97c2efe9f05739b9bb2e574eb5529cb8cf81ce238710267c19ad649cf350" exitCode=0 Mar 07 07:33:30 crc kubenswrapper[4738]: I0307 07:33:30.492311 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" event={"ID":"c1f1c711-8640-47e2-801e-fef3feefe876","Type":"ContainerDied","Data":"38cc97c2efe9f05739b9bb2e574eb5529cb8cf81ce238710267c19ad649cf350"} Mar 07 07:33:31 crc kubenswrapper[4738]: I0307 07:33:31.881923 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:31 crc kubenswrapper[4738]: I0307 07:33:31.932325 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4"] Mar 07 07:33:31 crc kubenswrapper[4738]: I0307 07:33:31.948355 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4"] Mar 07 07:33:31 crc kubenswrapper[4738]: I0307 07:33:31.954357 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btpdx\" (UniqueName: \"kubernetes.io/projected/c1f1c711-8640-47e2-801e-fef3feefe876-kube-api-access-btpdx\") pod \"c1f1c711-8640-47e2-801e-fef3feefe876\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " Mar 07 07:33:31 crc kubenswrapper[4738]: I0307 07:33:31.954436 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c1f1c711-8640-47e2-801e-fef3feefe876-etc-swift\") pod \"c1f1c711-8640-47e2-801e-fef3feefe876\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " Mar 07 07:33:31 crc kubenswrapper[4738]: I0307 07:33:31.954538 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1f1c711-8640-47e2-801e-fef3feefe876-scripts\") pod \"c1f1c711-8640-47e2-801e-fef3feefe876\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " Mar 07 07:33:31 crc kubenswrapper[4738]: I0307 07:33:31.954659 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c1f1c711-8640-47e2-801e-fef3feefe876-ring-data-devices\") pod \"c1f1c711-8640-47e2-801e-fef3feefe876\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " Mar 07 07:33:31 crc kubenswrapper[4738]: I0307 07:33:31.954688 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/c1f1c711-8640-47e2-801e-fef3feefe876-swiftconf\") pod \"c1f1c711-8640-47e2-801e-fef3feefe876\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " Mar 07 07:33:31 crc kubenswrapper[4738]: I0307 07:33:31.954781 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c1f1c711-8640-47e2-801e-fef3feefe876-dispersionconf\") pod \"c1f1c711-8640-47e2-801e-fef3feefe876\" (UID: \"c1f1c711-8640-47e2-801e-fef3feefe876\") " Mar 07 07:33:31 crc kubenswrapper[4738]: I0307 07:33:31.955991 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1f1c711-8640-47e2-801e-fef3feefe876-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "c1f1c711-8640-47e2-801e-fef3feefe876" (UID: "c1f1c711-8640-47e2-801e-fef3feefe876"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:31 crc kubenswrapper[4738]: I0307 07:33:31.957534 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1f1c711-8640-47e2-801e-fef3feefe876-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "c1f1c711-8640-47e2-801e-fef3feefe876" (UID: "c1f1c711-8640-47e2-801e-fef3feefe876"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:33:31 crc kubenswrapper[4738]: I0307 07:33:31.962812 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1f1c711-8640-47e2-801e-fef3feefe876-kube-api-access-btpdx" (OuterVolumeSpecName: "kube-api-access-btpdx") pod "c1f1c711-8640-47e2-801e-fef3feefe876" (UID: "c1f1c711-8640-47e2-801e-fef3feefe876"). InnerVolumeSpecName "kube-api-access-btpdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:33:31 crc kubenswrapper[4738]: I0307 07:33:31.980039 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1f1c711-8640-47e2-801e-fef3feefe876-scripts" (OuterVolumeSpecName: "scripts") pod "c1f1c711-8640-47e2-801e-fef3feefe876" (UID: "c1f1c711-8640-47e2-801e-fef3feefe876"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:31 crc kubenswrapper[4738]: I0307 07:33:31.996284 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1f1c711-8640-47e2-801e-fef3feefe876-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "c1f1c711-8640-47e2-801e-fef3feefe876" (UID: "c1f1c711-8640-47e2-801e-fef3feefe876"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:32 crc kubenswrapper[4738]: I0307 07:33:32.000362 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1f1c711-8640-47e2-801e-fef3feefe876-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "c1f1c711-8640-47e2-801e-fef3feefe876" (UID: "c1f1c711-8640-47e2-801e-fef3feefe876"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:32 crc kubenswrapper[4738]: I0307 07:33:32.056336 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/c1f1c711-8640-47e2-801e-fef3feefe876-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:32 crc kubenswrapper[4738]: I0307 07:33:32.056366 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/c1f1c711-8640-47e2-801e-fef3feefe876-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:32 crc kubenswrapper[4738]: I0307 07:33:32.056376 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/c1f1c711-8640-47e2-801e-fef3feefe876-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:32 crc kubenswrapper[4738]: I0307 07:33:32.056385 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btpdx\" (UniqueName: \"kubernetes.io/projected/c1f1c711-8640-47e2-801e-fef3feefe876-kube-api-access-btpdx\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:32 crc kubenswrapper[4738]: I0307 07:33:32.056394 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/c1f1c711-8640-47e2-801e-fef3feefe876-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:32 crc kubenswrapper[4738]: I0307 07:33:32.056404 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c1f1c711-8640-47e2-801e-fef3feefe876-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:32 crc kubenswrapper[4738]: I0307 07:33:32.397633 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1f1c711-8640-47e2-801e-fef3feefe876" path="/var/lib/kubelet/pods/c1f1c711-8640-47e2-801e-fef3feefe876/volumes" Mar 07 07:33:32 crc kubenswrapper[4738]: I0307 07:33:32.515570 4738 scope.go:117] "RemoveContainer" 
containerID="38cc97c2efe9f05739b9bb2e574eb5529cb8cf81ce238710267c19ad649cf350" Mar 07 07:33:32 crc kubenswrapper[4738]: I0307 07:33:32.515626 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mgsc4" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.109624 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4l827"] Mar 07 07:33:33 crc kubenswrapper[4738]: E0307 07:33:33.110342 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1f1c711-8640-47e2-801e-fef3feefe876" containerName="swift-ring-rebalance" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.110365 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1f1c711-8640-47e2-801e-fef3feefe876" containerName="swift-ring-rebalance" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.110564 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1f1c711-8640-47e2-801e-fef3feefe876" containerName="swift-ring-rebalance" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.111221 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.113511 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.114537 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.126632 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4l827"] Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.173288 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g7n9\" (UniqueName: \"kubernetes.io/projected/202abe38-52ee-4770-b153-61aba782eb4e-kube-api-access-4g7n9\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.173352 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/202abe38-52ee-4770-b153-61aba782eb4e-etc-swift\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.173378 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/202abe38-52ee-4770-b153-61aba782eb4e-scripts\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.173405 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/202abe38-52ee-4770-b153-61aba782eb4e-dispersionconf\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.173626 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/202abe38-52ee-4770-b153-61aba782eb4e-ring-data-devices\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.173915 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/202abe38-52ee-4770-b153-61aba782eb4e-swiftconf\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.275863 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/202abe38-52ee-4770-b153-61aba782eb4e-ring-data-devices\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.276049 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/202abe38-52ee-4770-b153-61aba782eb4e-swiftconf\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc 
kubenswrapper[4738]: I0307 07:33:33.276272 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g7n9\" (UniqueName: \"kubernetes.io/projected/202abe38-52ee-4770-b153-61aba782eb4e-kube-api-access-4g7n9\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.276337 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/202abe38-52ee-4770-b153-61aba782eb4e-etc-swift\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.276376 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/202abe38-52ee-4770-b153-61aba782eb4e-scripts\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.276420 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/202abe38-52ee-4770-b153-61aba782eb4e-dispersionconf\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.277296 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/202abe38-52ee-4770-b153-61aba782eb4e-ring-data-devices\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 
07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.277834 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/202abe38-52ee-4770-b153-61aba782eb4e-scripts\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.278732 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/202abe38-52ee-4770-b153-61aba782eb4e-etc-swift\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.284520 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/202abe38-52ee-4770-b153-61aba782eb4e-dispersionconf\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.287948 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/202abe38-52ee-4770-b153-61aba782eb4e-swiftconf\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.299756 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g7n9\" (UniqueName: \"kubernetes.io/projected/202abe38-52ee-4770-b153-61aba782eb4e-kube-api-access-4g7n9\") pod \"swift-ring-rebalance-debug-4l827\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 
07:33:33.385664 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:33:33 crc kubenswrapper[4738]: E0307 07:33:33.385925 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.440566 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:33 crc kubenswrapper[4738]: I0307 07:33:33.688654 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4l827"] Mar 07 07:33:33 crc kubenswrapper[4738]: W0307 07:33:33.692373 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod202abe38_52ee_4770_b153_61aba782eb4e.slice/crio-7b7baa62825a4ff366338bd0566910ccb5f8a3bf3525c82ad429453e1cf0096d WatchSource:0}: Error finding container 7b7baa62825a4ff366338bd0566910ccb5f8a3bf3525c82ad429453e1cf0096d: Status 404 returned error can't find the container with id 7b7baa62825a4ff366338bd0566910ccb5f8a3bf3525c82ad429453e1cf0096d Mar 07 07:33:34 crc kubenswrapper[4738]: I0307 07:33:34.541584 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" event={"ID":"202abe38-52ee-4770-b153-61aba782eb4e","Type":"ContainerStarted","Data":"f4020892bbabb1763f0f8b2ef48e9f8e6d2d664aa3f6b1e9363d7d637260bed3"} Mar 07 07:33:34 crc kubenswrapper[4738]: I0307 07:33:34.542025 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" event={"ID":"202abe38-52ee-4770-b153-61aba782eb4e","Type":"ContainerStarted","Data":"7b7baa62825a4ff366338bd0566910ccb5f8a3bf3525c82ad429453e1cf0096d"} Mar 07 07:33:34 crc kubenswrapper[4738]: I0307 07:33:34.568231 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" podStartSLOduration=1.5682033770000001 podStartE2EDuration="1.568203377s" podCreationTimestamp="2026-03-07 07:33:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:33:34.563312755 +0000 UTC m=+2033.028300116" watchObservedRunningTime="2026-03-07 07:33:34.568203377 +0000 UTC m=+2033.033190738" Mar 07 07:33:35 crc kubenswrapper[4738]: I0307 07:33:35.552844 4738 generic.go:334] "Generic (PLEG): container finished" podID="202abe38-52ee-4770-b153-61aba782eb4e" containerID="f4020892bbabb1763f0f8b2ef48e9f8e6d2d664aa3f6b1e9363d7d637260bed3" exitCode=0 Mar 07 07:33:35 crc kubenswrapper[4738]: I0307 07:33:35.552910 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" event={"ID":"202abe38-52ee-4770-b153-61aba782eb4e","Type":"ContainerDied","Data":"f4020892bbabb1763f0f8b2ef48e9f8e6d2d664aa3f6b1e9363d7d637260bed3"} Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.896401 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.936235 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/202abe38-52ee-4770-b153-61aba782eb4e-scripts\") pod \"202abe38-52ee-4770-b153-61aba782eb4e\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.936358 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/202abe38-52ee-4770-b153-61aba782eb4e-ring-data-devices\") pod \"202abe38-52ee-4770-b153-61aba782eb4e\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.936407 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/202abe38-52ee-4770-b153-61aba782eb4e-dispersionconf\") pod \"202abe38-52ee-4770-b153-61aba782eb4e\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.936783 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g7n9\" (UniqueName: \"kubernetes.io/projected/202abe38-52ee-4770-b153-61aba782eb4e-kube-api-access-4g7n9\") pod \"202abe38-52ee-4770-b153-61aba782eb4e\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.937178 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/202abe38-52ee-4770-b153-61aba782eb4e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "202abe38-52ee-4770-b153-61aba782eb4e" (UID: "202abe38-52ee-4770-b153-61aba782eb4e"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.937733 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/202abe38-52ee-4770-b153-61aba782eb4e-etc-swift\") pod \"202abe38-52ee-4770-b153-61aba782eb4e\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.937856 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/202abe38-52ee-4770-b153-61aba782eb4e-swiftconf\") pod \"202abe38-52ee-4770-b153-61aba782eb4e\" (UID: \"202abe38-52ee-4770-b153-61aba782eb4e\") " Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.938360 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/202abe38-52ee-4770-b153-61aba782eb4e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "202abe38-52ee-4770-b153-61aba782eb4e" (UID: "202abe38-52ee-4770-b153-61aba782eb4e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.938408 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/202abe38-52ee-4770-b153-61aba782eb4e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.940083 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4l827"] Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.943246 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/202abe38-52ee-4770-b153-61aba782eb4e-kube-api-access-4g7n9" (OuterVolumeSpecName: "kube-api-access-4g7n9") pod "202abe38-52ee-4770-b153-61aba782eb4e" (UID: "202abe38-52ee-4770-b153-61aba782eb4e"). 
InnerVolumeSpecName "kube-api-access-4g7n9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.945606 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4l827"] Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.957309 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/202abe38-52ee-4770-b153-61aba782eb4e-scripts" (OuterVolumeSpecName: "scripts") pod "202abe38-52ee-4770-b153-61aba782eb4e" (UID: "202abe38-52ee-4770-b153-61aba782eb4e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.958977 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/202abe38-52ee-4770-b153-61aba782eb4e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "202abe38-52ee-4770-b153-61aba782eb4e" (UID: "202abe38-52ee-4770-b153-61aba782eb4e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:36 crc kubenswrapper[4738]: I0307 07:33:36.972021 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/202abe38-52ee-4770-b153-61aba782eb4e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "202abe38-52ee-4770-b153-61aba782eb4e" (UID: "202abe38-52ee-4770-b153-61aba782eb4e"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:37 crc kubenswrapper[4738]: I0307 07:33:37.040116 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g7n9\" (UniqueName: \"kubernetes.io/projected/202abe38-52ee-4770-b153-61aba782eb4e-kube-api-access-4g7n9\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:37 crc kubenswrapper[4738]: I0307 07:33:37.040165 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/202abe38-52ee-4770-b153-61aba782eb4e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:37 crc kubenswrapper[4738]: I0307 07:33:37.040179 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/202abe38-52ee-4770-b153-61aba782eb4e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:37 crc kubenswrapper[4738]: I0307 07:33:37.040191 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/202abe38-52ee-4770-b153-61aba782eb4e-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:37 crc kubenswrapper[4738]: I0307 07:33:37.040201 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/202abe38-52ee-4770-b153-61aba782eb4e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:37 crc kubenswrapper[4738]: I0307 07:33:37.577244 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b7baa62825a4ff366338bd0566910ccb5f8a3bf3525c82ad429453e1cf0096d" Mar 07 07:33:37 crc kubenswrapper[4738]: I0307 07:33:37.577344 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4l827" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.125598 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq"] Mar 07 07:33:38 crc kubenswrapper[4738]: E0307 07:33:38.126324 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="202abe38-52ee-4770-b153-61aba782eb4e" containerName="swift-ring-rebalance" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.126367 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="202abe38-52ee-4770-b153-61aba782eb4e" containerName="swift-ring-rebalance" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.126743 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="202abe38-52ee-4770-b153-61aba782eb4e" containerName="swift-ring-rebalance" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.127932 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.130077 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.130646 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.133394 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq"] Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.259516 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7580840f-80e0-4c09-b316-f080b56c9b6f-scripts\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.259590 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7580840f-80e0-4c09-b316-f080b56c9b6f-dispersionconf\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.259642 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znff5\" (UniqueName: \"kubernetes.io/projected/7580840f-80e0-4c09-b316-f080b56c9b6f-kube-api-access-znff5\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.259683 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7580840f-80e0-4c09-b316-f080b56c9b6f-etc-swift\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.259779 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7580840f-80e0-4c09-b316-f080b56c9b6f-swiftconf\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.259824 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/7580840f-80e0-4c09-b316-f080b56c9b6f-ring-data-devices\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.361382 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7580840f-80e0-4c09-b316-f080b56c9b6f-scripts\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.361737 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7580840f-80e0-4c09-b316-f080b56c9b6f-dispersionconf\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.361857 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znff5\" (UniqueName: \"kubernetes.io/projected/7580840f-80e0-4c09-b316-f080b56c9b6f-kube-api-access-znff5\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.361977 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7580840f-80e0-4c09-b316-f080b56c9b6f-etc-swift\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.362091 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/7580840f-80e0-4c09-b316-f080b56c9b6f-swiftconf\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.362242 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7580840f-80e0-4c09-b316-f080b56c9b6f-ring-data-devices\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.362679 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7580840f-80e0-4c09-b316-f080b56c9b6f-etc-swift\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.362984 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7580840f-80e0-4c09-b316-f080b56c9b6f-scripts\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.363333 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7580840f-80e0-4c09-b316-f080b56c9b6f-ring-data-devices\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.368119 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/7580840f-80e0-4c09-b316-f080b56c9b6f-swiftconf\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.371657 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7580840f-80e0-4c09-b316-f080b56c9b6f-dispersionconf\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.397827 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znff5\" (UniqueName: \"kubernetes.io/projected/7580840f-80e0-4c09-b316-f080b56c9b6f-kube-api-access-znff5\") pod \"swift-ring-rebalance-debug-7h7jq\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.404352 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="202abe38-52ee-4770-b153-61aba782eb4e" path="/var/lib/kubelet/pods/202abe38-52ee-4770-b153-61aba782eb4e/volumes" Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.496107 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:38 crc kubenswrapper[4738]: W0307 07:33:38.753427 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7580840f_80e0_4c09_b316_f080b56c9b6f.slice/crio-f743df5dd5cf67fb3b240526712f8b5370590fd21a91767f28234e50e1e5a5fa WatchSource:0}: Error finding container f743df5dd5cf67fb3b240526712f8b5370590fd21a91767f28234e50e1e5a5fa: Status 404 returned error can't find the container with id f743df5dd5cf67fb3b240526712f8b5370590fd21a91767f28234e50e1e5a5fa Mar 07 07:33:38 crc kubenswrapper[4738]: I0307 07:33:38.758968 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq"] Mar 07 07:33:39 crc kubenswrapper[4738]: I0307 07:33:39.597555 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" event={"ID":"7580840f-80e0-4c09-b316-f080b56c9b6f","Type":"ContainerStarted","Data":"323c30b388a54571d03817c05a6a4531c1d12a89220da7ef173f35fc30800e8c"} Mar 07 07:33:39 crc kubenswrapper[4738]: I0307 07:33:39.597972 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" event={"ID":"7580840f-80e0-4c09-b316-f080b56c9b6f","Type":"ContainerStarted","Data":"f743df5dd5cf67fb3b240526712f8b5370590fd21a91767f28234e50e1e5a5fa"} Mar 07 07:33:39 crc kubenswrapper[4738]: I0307 07:33:39.620464 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" podStartSLOduration=1.620446813 podStartE2EDuration="1.620446813s" podCreationTimestamp="2026-03-07 07:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:33:39.616856755 +0000 UTC m=+2038.081844116" watchObservedRunningTime="2026-03-07 
07:33:39.620446813 +0000 UTC m=+2038.085434134" Mar 07 07:33:40 crc kubenswrapper[4738]: I0307 07:33:40.611854 4738 generic.go:334] "Generic (PLEG): container finished" podID="7580840f-80e0-4c09-b316-f080b56c9b6f" containerID="323c30b388a54571d03817c05a6a4531c1d12a89220da7ef173f35fc30800e8c" exitCode=0 Mar 07 07:33:40 crc kubenswrapper[4738]: I0307 07:33:40.611920 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" event={"ID":"7580840f-80e0-4c09-b316-f080b56c9b6f","Type":"ContainerDied","Data":"323c30b388a54571d03817c05a6a4531c1d12a89220da7ef173f35fc30800e8c"} Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.032786 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.080367 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq"] Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.087643 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq"] Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.222680 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znff5\" (UniqueName: \"kubernetes.io/projected/7580840f-80e0-4c09-b316-f080b56c9b6f-kube-api-access-znff5\") pod \"7580840f-80e0-4c09-b316-f080b56c9b6f\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.222740 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7580840f-80e0-4c09-b316-f080b56c9b6f-ring-data-devices\") pod \"7580840f-80e0-4c09-b316-f080b56c9b6f\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.222803 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7580840f-80e0-4c09-b316-f080b56c9b6f-scripts\") pod \"7580840f-80e0-4c09-b316-f080b56c9b6f\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.222821 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7580840f-80e0-4c09-b316-f080b56c9b6f-swiftconf\") pod \"7580840f-80e0-4c09-b316-f080b56c9b6f\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.222857 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7580840f-80e0-4c09-b316-f080b56c9b6f-dispersionconf\") pod \"7580840f-80e0-4c09-b316-f080b56c9b6f\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.222874 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7580840f-80e0-4c09-b316-f080b56c9b6f-etc-swift\") pod \"7580840f-80e0-4c09-b316-f080b56c9b6f\" (UID: \"7580840f-80e0-4c09-b316-f080b56c9b6f\") " Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.223916 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7580840f-80e0-4c09-b316-f080b56c9b6f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7580840f-80e0-4c09-b316-f080b56c9b6f" (UID: "7580840f-80e0-4c09-b316-f080b56c9b6f"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.224435 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7580840f-80e0-4c09-b316-f080b56c9b6f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7580840f-80e0-4c09-b316-f080b56c9b6f" (UID: "7580840f-80e0-4c09-b316-f080b56c9b6f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.236350 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7580840f-80e0-4c09-b316-f080b56c9b6f-kube-api-access-znff5" (OuterVolumeSpecName: "kube-api-access-znff5") pod "7580840f-80e0-4c09-b316-f080b56c9b6f" (UID: "7580840f-80e0-4c09-b316-f080b56c9b6f"). InnerVolumeSpecName "kube-api-access-znff5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.244343 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7580840f-80e0-4c09-b316-f080b56c9b6f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7580840f-80e0-4c09-b316-f080b56c9b6f" (UID: "7580840f-80e0-4c09-b316-f080b56c9b6f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.244803 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7580840f-80e0-4c09-b316-f080b56c9b6f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7580840f-80e0-4c09-b316-f080b56c9b6f" (UID: "7580840f-80e0-4c09-b316-f080b56c9b6f"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.266116 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7580840f-80e0-4c09-b316-f080b56c9b6f-scripts" (OuterVolumeSpecName: "scripts") pod "7580840f-80e0-4c09-b316-f080b56c9b6f" (UID: "7580840f-80e0-4c09-b316-f080b56c9b6f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.325861 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znff5\" (UniqueName: \"kubernetes.io/projected/7580840f-80e0-4c09-b316-f080b56c9b6f-kube-api-access-znff5\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.325956 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7580840f-80e0-4c09-b316-f080b56c9b6f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.325987 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7580840f-80e0-4c09-b316-f080b56c9b6f-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.326012 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7580840f-80e0-4c09-b316-f080b56c9b6f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.326041 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7580840f-80e0-4c09-b316-f080b56c9b6f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.326065 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/7580840f-80e0-4c09-b316-f080b56c9b6f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.407282 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7580840f-80e0-4c09-b316-f080b56c9b6f" path="/var/lib/kubelet/pods/7580840f-80e0-4c09-b316-f080b56c9b6f/volumes" Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.633063 4738 scope.go:117] "RemoveContainer" containerID="323c30b388a54571d03817c05a6a4531c1d12a89220da7ef173f35fc30800e8c" Mar 07 07:33:42 crc kubenswrapper[4738]: I0307 07:33:42.633100 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7h7jq" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.226415 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-llzwz"] Mar 07 07:33:43 crc kubenswrapper[4738]: E0307 07:33:43.227469 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7580840f-80e0-4c09-b316-f080b56c9b6f" containerName="swift-ring-rebalance" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.227501 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="7580840f-80e0-4c09-b316-f080b56c9b6f" containerName="swift-ring-rebalance" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.227939 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="7580840f-80e0-4c09-b316-f080b56c9b6f" containerName="swift-ring-rebalance" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.229109 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.232545 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.232775 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.239676 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-llzwz"] Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.340028 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3644487d-871e-460b-99e4-1ae30b5f8e0f-ring-data-devices\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.340133 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjkxm\" (UniqueName: \"kubernetes.io/projected/3644487d-871e-460b-99e4-1ae30b5f8e0f-kube-api-access-kjkxm\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.340245 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3644487d-871e-460b-99e4-1ae30b5f8e0f-scripts\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.340320 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3644487d-871e-460b-99e4-1ae30b5f8e0f-swiftconf\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.340357 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3644487d-871e-460b-99e4-1ae30b5f8e0f-dispersionconf\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.340437 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3644487d-871e-460b-99e4-1ae30b5f8e0f-etc-swift\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.441475 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3644487d-871e-460b-99e4-1ae30b5f8e0f-ring-data-devices\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.441535 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjkxm\" (UniqueName: \"kubernetes.io/projected/3644487d-871e-460b-99e4-1ae30b5f8e0f-kube-api-access-kjkxm\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 
crc kubenswrapper[4738]: I0307 07:33:43.441573 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3644487d-871e-460b-99e4-1ae30b5f8e0f-scripts\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.441622 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3644487d-871e-460b-99e4-1ae30b5f8e0f-swiftconf\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.441645 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3644487d-871e-460b-99e4-1ae30b5f8e0f-dispersionconf\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.441700 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3644487d-871e-460b-99e4-1ae30b5f8e0f-etc-swift\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.442416 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3644487d-871e-460b-99e4-1ae30b5f8e0f-ring-data-devices\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc 
kubenswrapper[4738]: I0307 07:33:43.442478 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3644487d-871e-460b-99e4-1ae30b5f8e0f-scripts\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.442779 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3644487d-871e-460b-99e4-1ae30b5f8e0f-etc-swift\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.449628 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3644487d-871e-460b-99e4-1ae30b5f8e0f-swiftconf\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.457037 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3644487d-871e-460b-99e4-1ae30b5f8e0f-dispersionconf\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.467135 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjkxm\" (UniqueName: \"kubernetes.io/projected/3644487d-871e-460b-99e4-1ae30b5f8e0f-kube-api-access-kjkxm\") pod \"swift-ring-rebalance-debug-llzwz\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 
07:33:43.560618 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:43 crc kubenswrapper[4738]: I0307 07:33:43.990281 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-llzwz"] Mar 07 07:33:44 crc kubenswrapper[4738]: I0307 07:33:44.665854 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" event={"ID":"3644487d-871e-460b-99e4-1ae30b5f8e0f","Type":"ContainerStarted","Data":"142b3c49962feb4ee438dbc912d98c32f7bb88c4b9c90a420590eaea712f8dd1"} Mar 07 07:33:44 crc kubenswrapper[4738]: I0307 07:33:44.666368 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" event={"ID":"3644487d-871e-460b-99e4-1ae30b5f8e0f","Type":"ContainerStarted","Data":"61137231462ff20e1bfd3042982ad479a9a75a7059fd24a4d5a7f178582fa918"} Mar 07 07:33:44 crc kubenswrapper[4738]: I0307 07:33:44.698889 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" podStartSLOduration=1.698870686 podStartE2EDuration="1.698870686s" podCreationTimestamp="2026-03-07 07:33:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:33:44.691510047 +0000 UTC m=+2043.156497378" watchObservedRunningTime="2026-03-07 07:33:44.698870686 +0000 UTC m=+2043.163858007" Mar 07 07:33:45 crc kubenswrapper[4738]: I0307 07:33:45.680413 4738 generic.go:334] "Generic (PLEG): container finished" podID="3644487d-871e-460b-99e4-1ae30b5f8e0f" containerID="142b3c49962feb4ee438dbc912d98c32f7bb88c4b9c90a420590eaea712f8dd1" exitCode=0 Mar 07 07:33:45 crc kubenswrapper[4738]: I0307 07:33:45.680525 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" 
event={"ID":"3644487d-871e-460b-99e4-1ae30b5f8e0f","Type":"ContainerDied","Data":"142b3c49962feb4ee438dbc912d98c32f7bb88c4b9c90a420590eaea712f8dd1"} Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.084413 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.119337 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-llzwz"] Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.125584 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-llzwz"] Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.198405 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3644487d-871e-460b-99e4-1ae30b5f8e0f-ring-data-devices\") pod \"3644487d-871e-460b-99e4-1ae30b5f8e0f\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.198463 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3644487d-871e-460b-99e4-1ae30b5f8e0f-scripts\") pod \"3644487d-871e-460b-99e4-1ae30b5f8e0f\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.198624 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3644487d-871e-460b-99e4-1ae30b5f8e0f-dispersionconf\") pod \"3644487d-871e-460b-99e4-1ae30b5f8e0f\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.198669 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3644487d-871e-460b-99e4-1ae30b5f8e0f-swiftconf\") 
pod \"3644487d-871e-460b-99e4-1ae30b5f8e0f\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.198748 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjkxm\" (UniqueName: \"kubernetes.io/projected/3644487d-871e-460b-99e4-1ae30b5f8e0f-kube-api-access-kjkxm\") pod \"3644487d-871e-460b-99e4-1ae30b5f8e0f\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.198809 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3644487d-871e-460b-99e4-1ae30b5f8e0f-etc-swift\") pod \"3644487d-871e-460b-99e4-1ae30b5f8e0f\" (UID: \"3644487d-871e-460b-99e4-1ae30b5f8e0f\") " Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.199358 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3644487d-871e-460b-99e4-1ae30b5f8e0f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3644487d-871e-460b-99e4-1ae30b5f8e0f" (UID: "3644487d-871e-460b-99e4-1ae30b5f8e0f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.199596 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3644487d-871e-460b-99e4-1ae30b5f8e0f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3644487d-871e-460b-99e4-1ae30b5f8e0f" (UID: "3644487d-871e-460b-99e4-1ae30b5f8e0f"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.203915 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3644487d-871e-460b-99e4-1ae30b5f8e0f-kube-api-access-kjkxm" (OuterVolumeSpecName: "kube-api-access-kjkxm") pod "3644487d-871e-460b-99e4-1ae30b5f8e0f" (UID: "3644487d-871e-460b-99e4-1ae30b5f8e0f"). InnerVolumeSpecName "kube-api-access-kjkxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.218672 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3644487d-871e-460b-99e4-1ae30b5f8e0f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3644487d-871e-460b-99e4-1ae30b5f8e0f" (UID: "3644487d-871e-460b-99e4-1ae30b5f8e0f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.218885 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3644487d-871e-460b-99e4-1ae30b5f8e0f-scripts" (OuterVolumeSpecName: "scripts") pod "3644487d-871e-460b-99e4-1ae30b5f8e0f" (UID: "3644487d-871e-460b-99e4-1ae30b5f8e0f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.225519 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3644487d-871e-460b-99e4-1ae30b5f8e0f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3644487d-871e-460b-99e4-1ae30b5f8e0f" (UID: "3644487d-871e-460b-99e4-1ae30b5f8e0f"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.300530 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3644487d-871e-460b-99e4-1ae30b5f8e0f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.300561 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3644487d-871e-460b-99e4-1ae30b5f8e0f-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.300570 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3644487d-871e-460b-99e4-1ae30b5f8e0f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.300580 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3644487d-871e-460b-99e4-1ae30b5f8e0f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.300589 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjkxm\" (UniqueName: \"kubernetes.io/projected/3644487d-871e-460b-99e4-1ae30b5f8e0f-kube-api-access-kjkxm\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.300599 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3644487d-871e-460b-99e4-1ae30b5f8e0f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.386455 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:33:47 crc kubenswrapper[4738]: E0307 07:33:47.386845 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.701630 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="61137231462ff20e1bfd3042982ad479a9a75a7059fd24a4d5a7f178582fa918" Mar 07 07:33:47 crc kubenswrapper[4738]: I0307 07:33:47.701900 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-llzwz" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.317542 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7257r"] Mar 07 07:33:48 crc kubenswrapper[4738]: E0307 07:33:48.318483 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3644487d-871e-460b-99e4-1ae30b5f8e0f" containerName="swift-ring-rebalance" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.318508 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="3644487d-871e-460b-99e4-1ae30b5f8e0f" containerName="swift-ring-rebalance" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.318847 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="3644487d-871e-460b-99e4-1ae30b5f8e0f" containerName="swift-ring-rebalance" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.319712 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.322078 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.322691 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.327184 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7257r"] Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.398589 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3644487d-871e-460b-99e4-1ae30b5f8e0f" path="/var/lib/kubelet/pods/3644487d-871e-460b-99e4-1ae30b5f8e0f/volumes" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.416891 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/46371604-ccf6-4fb1-a757-e1b29f694a9c-dispersionconf\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.416953 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/46371604-ccf6-4fb1-a757-e1b29f694a9c-etc-swift\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.417019 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/46371604-ccf6-4fb1-a757-e1b29f694a9c-ring-data-devices\") pod 
\"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.417056 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dznzl\" (UniqueName: \"kubernetes.io/projected/46371604-ccf6-4fb1-a757-e1b29f694a9c-kube-api-access-dznzl\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.417084 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46371604-ccf6-4fb1-a757-e1b29f694a9c-scripts\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.417288 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/46371604-ccf6-4fb1-a757-e1b29f694a9c-swiftconf\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.518704 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/46371604-ccf6-4fb1-a757-e1b29f694a9c-swiftconf\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.518799 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/46371604-ccf6-4fb1-a757-e1b29f694a9c-dispersionconf\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.518830 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/46371604-ccf6-4fb1-a757-e1b29f694a9c-etc-swift\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.519934 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/46371604-ccf6-4fb1-a757-e1b29f694a9c-etc-swift\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.520314 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/46371604-ccf6-4fb1-a757-e1b29f694a9c-ring-data-devices\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.520466 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dznzl\" (UniqueName: \"kubernetes.io/projected/46371604-ccf6-4fb1-a757-e1b29f694a9c-kube-api-access-dznzl\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.520714 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/46371604-ccf6-4fb1-a757-e1b29f694a9c-scripts\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.521242 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46371604-ccf6-4fb1-a757-e1b29f694a9c-scripts\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.521448 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/46371604-ccf6-4fb1-a757-e1b29f694a9c-ring-data-devices\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.525766 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/46371604-ccf6-4fb1-a757-e1b29f694a9c-dispersionconf\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.527474 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/46371604-ccf6-4fb1-a757-e1b29f694a9c-swiftconf\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.550241 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dznzl\" (UniqueName: 
\"kubernetes.io/projected/46371604-ccf6-4fb1-a757-e1b29f694a9c-kube-api-access-dznzl\") pod \"swift-ring-rebalance-debug-7257r\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:48 crc kubenswrapper[4738]: I0307 07:33:48.645941 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:49 crc kubenswrapper[4738]: I0307 07:33:49.079427 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7257r"] Mar 07 07:33:49 crc kubenswrapper[4738]: W0307 07:33:49.087944 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46371604_ccf6_4fb1_a757_e1b29f694a9c.slice/crio-8e2ebbd992a61874a22313a2b685054f1b1ba01c489f6dd86402d1ce154cdd1b WatchSource:0}: Error finding container 8e2ebbd992a61874a22313a2b685054f1b1ba01c489f6dd86402d1ce154cdd1b: Status 404 returned error can't find the container with id 8e2ebbd992a61874a22313a2b685054f1b1ba01c489f6dd86402d1ce154cdd1b Mar 07 07:33:49 crc kubenswrapper[4738]: I0307 07:33:49.726933 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" event={"ID":"46371604-ccf6-4fb1-a757-e1b29f694a9c","Type":"ContainerStarted","Data":"79dd817c5ec883a3e4f355ad71041463e5a8e7175d3797da99fb185fe4943049"} Mar 07 07:33:49 crc kubenswrapper[4738]: I0307 07:33:49.727341 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" event={"ID":"46371604-ccf6-4fb1-a757-e1b29f694a9c","Type":"ContainerStarted","Data":"8e2ebbd992a61874a22313a2b685054f1b1ba01c489f6dd86402d1ce154cdd1b"} Mar 07 07:33:49 crc kubenswrapper[4738]: I0307 07:33:49.748668 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" 
podStartSLOduration=1.7486443839999999 podStartE2EDuration="1.748644384s" podCreationTimestamp="2026-03-07 07:33:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:33:49.747453422 +0000 UTC m=+2048.212440783" watchObservedRunningTime="2026-03-07 07:33:49.748644384 +0000 UTC m=+2048.213631735" Mar 07 07:33:50 crc kubenswrapper[4738]: I0307 07:33:50.738987 4738 generic.go:334] "Generic (PLEG): container finished" podID="46371604-ccf6-4fb1-a757-e1b29f694a9c" containerID="79dd817c5ec883a3e4f355ad71041463e5a8e7175d3797da99fb185fe4943049" exitCode=0 Mar 07 07:33:50 crc kubenswrapper[4738]: I0307 07:33:50.739050 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" event={"ID":"46371604-ccf6-4fb1-a757-e1b29f694a9c","Type":"ContainerDied","Data":"79dd817c5ec883a3e4f355ad71041463e5a8e7175d3797da99fb185fe4943049"} Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.116409 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.161303 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7257r"] Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.171799 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7257r"] Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.281737 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/46371604-ccf6-4fb1-a757-e1b29f694a9c-etc-swift\") pod \"46371604-ccf6-4fb1-a757-e1b29f694a9c\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.281820 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/46371604-ccf6-4fb1-a757-e1b29f694a9c-ring-data-devices\") pod \"46371604-ccf6-4fb1-a757-e1b29f694a9c\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.281904 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/46371604-ccf6-4fb1-a757-e1b29f694a9c-dispersionconf\") pod \"46371604-ccf6-4fb1-a757-e1b29f694a9c\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.282006 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/46371604-ccf6-4fb1-a757-e1b29f694a9c-swiftconf\") pod \"46371604-ccf6-4fb1-a757-e1b29f694a9c\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.282062 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-dznzl\" (UniqueName: \"kubernetes.io/projected/46371604-ccf6-4fb1-a757-e1b29f694a9c-kube-api-access-dznzl\") pod \"46371604-ccf6-4fb1-a757-e1b29f694a9c\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.282132 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46371604-ccf6-4fb1-a757-e1b29f694a9c-scripts\") pod \"46371604-ccf6-4fb1-a757-e1b29f694a9c\" (UID: \"46371604-ccf6-4fb1-a757-e1b29f694a9c\") " Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.283291 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46371604-ccf6-4fb1-a757-e1b29f694a9c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "46371604-ccf6-4fb1-a757-e1b29f694a9c" (UID: "46371604-ccf6-4fb1-a757-e1b29f694a9c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.283475 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46371604-ccf6-4fb1-a757-e1b29f694a9c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "46371604-ccf6-4fb1-a757-e1b29f694a9c" (UID: "46371604-ccf6-4fb1-a757-e1b29f694a9c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.294203 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46371604-ccf6-4fb1-a757-e1b29f694a9c-kube-api-access-dznzl" (OuterVolumeSpecName: "kube-api-access-dznzl") pod "46371604-ccf6-4fb1-a757-e1b29f694a9c" (UID: "46371604-ccf6-4fb1-a757-e1b29f694a9c"). InnerVolumeSpecName "kube-api-access-dznzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.307046 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46371604-ccf6-4fb1-a757-e1b29f694a9c-scripts" (OuterVolumeSpecName: "scripts") pod "46371604-ccf6-4fb1-a757-e1b29f694a9c" (UID: "46371604-ccf6-4fb1-a757-e1b29f694a9c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.323296 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46371604-ccf6-4fb1-a757-e1b29f694a9c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "46371604-ccf6-4fb1-a757-e1b29f694a9c" (UID: "46371604-ccf6-4fb1-a757-e1b29f694a9c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.332478 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46371604-ccf6-4fb1-a757-e1b29f694a9c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "46371604-ccf6-4fb1-a757-e1b29f694a9c" (UID: "46371604-ccf6-4fb1-a757-e1b29f694a9c"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.384684 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46371604-ccf6-4fb1-a757-e1b29f694a9c-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.384735 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/46371604-ccf6-4fb1-a757-e1b29f694a9c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.384754 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/46371604-ccf6-4fb1-a757-e1b29f694a9c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.384778 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/46371604-ccf6-4fb1-a757-e1b29f694a9c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.384796 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/46371604-ccf6-4fb1-a757-e1b29f694a9c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.384812 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dznzl\" (UniqueName: \"kubernetes.io/projected/46371604-ccf6-4fb1-a757-e1b29f694a9c-kube-api-access-dznzl\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.400886 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46371604-ccf6-4fb1-a757-e1b29f694a9c" path="/var/lib/kubelet/pods/46371604-ccf6-4fb1-a757-e1b29f694a9c/volumes" Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.759097 4738 scope.go:117] "RemoveContainer" 
containerID="79dd817c5ec883a3e4f355ad71041463e5a8e7175d3797da99fb185fe4943049" Mar 07 07:33:52 crc kubenswrapper[4738]: I0307 07:33:52.759223 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7257r" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.304896 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd"] Mar 07 07:33:53 crc kubenswrapper[4738]: E0307 07:33:53.305553 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46371604-ccf6-4fb1-a757-e1b29f694a9c" containerName="swift-ring-rebalance" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.305586 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="46371604-ccf6-4fb1-a757-e1b29f694a9c" containerName="swift-ring-rebalance" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.305937 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="46371604-ccf6-4fb1-a757-e1b29f694a9c" containerName="swift-ring-rebalance" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.307101 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.310007 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.310563 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.316127 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd"] Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.404441 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-etc-swift\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.405102 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-scripts\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.405478 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-swiftconf\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.405709 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-4p556\" (UniqueName: \"kubernetes.io/projected/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-kube-api-access-4p556\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.406011 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-dispersionconf\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.406450 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-ring-data-devices\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.508418 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-swiftconf\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.508479 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p556\" (UniqueName: \"kubernetes.io/projected/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-kube-api-access-4p556\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 
07:33:53.508556 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-dispersionconf\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.508680 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-ring-data-devices\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.508727 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-etc-swift\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.508775 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-scripts\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.509820 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-scripts\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.510373 
4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-etc-swift\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.511489 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-ring-data-devices\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.516988 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-swiftconf\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.518480 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-dispersionconf\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.531182 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p556\" (UniqueName: \"kubernetes.io/projected/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-kube-api-access-4p556\") pod \"swift-ring-rebalance-debug-j6nkd\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.629867 4738 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:53 crc kubenswrapper[4738]: I0307 07:33:53.864226 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd"] Mar 07 07:33:54 crc kubenswrapper[4738]: I0307 07:33:54.784261 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" event={"ID":"5f0331e0-2a87-4f7f-968e-516e74ef9ee5","Type":"ContainerStarted","Data":"693a9c223b7503ece2072a18d5844b4534777b36159c195ebf6f3273a9d6d87d"} Mar 07 07:33:54 crc kubenswrapper[4738]: I0307 07:33:54.784623 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" event={"ID":"5f0331e0-2a87-4f7f-968e-516e74ef9ee5","Type":"ContainerStarted","Data":"54651b08bf4f3b4b3be06ccebf9af06307c5b77718598cb43d5fe9c536a32660"} Mar 07 07:33:54 crc kubenswrapper[4738]: I0307 07:33:54.818455 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" podStartSLOduration=1.818427535 podStartE2EDuration="1.818427535s" podCreationTimestamp="2026-03-07 07:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:33:54.808018763 +0000 UTC m=+2053.273006134" watchObservedRunningTime="2026-03-07 07:33:54.818427535 +0000 UTC m=+2053.283414886" Mar 07 07:33:55 crc kubenswrapper[4738]: I0307 07:33:55.795028 4738 generic.go:334] "Generic (PLEG): container finished" podID="5f0331e0-2a87-4f7f-968e-516e74ef9ee5" containerID="693a9c223b7503ece2072a18d5844b4534777b36159c195ebf6f3273a9d6d87d" exitCode=0 Mar 07 07:33:55 crc kubenswrapper[4738]: I0307 07:33:55.795228 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" 
event={"ID":"5f0331e0-2a87-4f7f-968e-516e74ef9ee5","Type":"ContainerDied","Data":"693a9c223b7503ece2072a18d5844b4534777b36159c195ebf6f3273a9d6d87d"} Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.141820 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.179208 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd"] Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.184391 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd"] Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.268730 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-etc-swift\") pod \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.268773 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p556\" (UniqueName: \"kubernetes.io/projected/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-kube-api-access-4p556\") pod \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.268828 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-scripts\") pod \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.268894 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-ring-data-devices\") pod \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.268927 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-dispersionconf\") pod \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.269001 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-swiftconf\") pod \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\" (UID: \"5f0331e0-2a87-4f7f-968e-516e74ef9ee5\") " Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.269621 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5f0331e0-2a87-4f7f-968e-516e74ef9ee5" (UID: "5f0331e0-2a87-4f7f-968e-516e74ef9ee5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.270361 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5f0331e0-2a87-4f7f-968e-516e74ef9ee5" (UID: "5f0331e0-2a87-4f7f-968e-516e74ef9ee5"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.276446 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-kube-api-access-4p556" (OuterVolumeSpecName: "kube-api-access-4p556") pod "5f0331e0-2a87-4f7f-968e-516e74ef9ee5" (UID: "5f0331e0-2a87-4f7f-968e-516e74ef9ee5"). InnerVolumeSpecName "kube-api-access-4p556". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.291501 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5f0331e0-2a87-4f7f-968e-516e74ef9ee5" (UID: "5f0331e0-2a87-4f7f-968e-516e74ef9ee5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.298017 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-scripts" (OuterVolumeSpecName: "scripts") pod "5f0331e0-2a87-4f7f-968e-516e74ef9ee5" (UID: "5f0331e0-2a87-4f7f-968e-516e74ef9ee5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.299530 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5f0331e0-2a87-4f7f-968e-516e74ef9ee5" (UID: "5f0331e0-2a87-4f7f-968e-516e74ef9ee5"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.370855 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.370888 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p556\" (UniqueName: \"kubernetes.io/projected/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-kube-api-access-4p556\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.370899 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.370909 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.370917 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.370925 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f0331e0-2a87-4f7f-968e-516e74ef9ee5-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.814702 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54651b08bf4f3b4b3be06ccebf9af06307c5b77718598cb43d5fe9c536a32660" Mar 07 07:33:57 crc kubenswrapper[4738]: I0307 07:33:57.814772 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-j6nkd" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.356749 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb"] Mar 07 07:33:58 crc kubenswrapper[4738]: E0307 07:33:58.357127 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f0331e0-2a87-4f7f-968e-516e74ef9ee5" containerName="swift-ring-rebalance" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.357144 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f0331e0-2a87-4f7f-968e-516e74ef9ee5" containerName="swift-ring-rebalance" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.357356 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f0331e0-2a87-4f7f-968e-516e74ef9ee5" containerName="swift-ring-rebalance" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.357916 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.359851 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.362265 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.362473 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb"] Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.398547 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f0331e0-2a87-4f7f-968e-516e74ef9ee5" path="/var/lib/kubelet/pods/5f0331e0-2a87-4f7f-968e-516e74ef9ee5/volumes" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.487494 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bc778a31-3a41-4260-92a3-c1acfba08e32-swiftconf\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.487922 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bc778a31-3a41-4260-92a3-c1acfba08e32-dispersionconf\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.488022 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bc778a31-3a41-4260-92a3-c1acfba08e32-ring-data-devices\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.488135 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc778a31-3a41-4260-92a3-c1acfba08e32-scripts\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.488352 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9k9mr\" (UniqueName: \"kubernetes.io/projected/bc778a31-3a41-4260-92a3-c1acfba08e32-kube-api-access-9k9mr\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: 
I0307 07:33:58.488433 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bc778a31-3a41-4260-92a3-c1acfba08e32-etc-swift\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.590363 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bc778a31-3a41-4260-92a3-c1acfba08e32-swiftconf\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.590490 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bc778a31-3a41-4260-92a3-c1acfba08e32-dispersionconf\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.590564 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bc778a31-3a41-4260-92a3-c1acfba08e32-ring-data-devices\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.590608 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc778a31-3a41-4260-92a3-c1acfba08e32-scripts\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc 
kubenswrapper[4738]: I0307 07:33:58.590680 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9k9mr\" (UniqueName: \"kubernetes.io/projected/bc778a31-3a41-4260-92a3-c1acfba08e32-kube-api-access-9k9mr\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.590714 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bc778a31-3a41-4260-92a3-c1acfba08e32-etc-swift\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.591241 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bc778a31-3a41-4260-92a3-c1acfba08e32-etc-swift\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.591754 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bc778a31-3a41-4260-92a3-c1acfba08e32-ring-data-devices\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.592300 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc778a31-3a41-4260-92a3-c1acfba08e32-scripts\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc 
kubenswrapper[4738]: I0307 07:33:58.597844 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bc778a31-3a41-4260-92a3-c1acfba08e32-dispersionconf\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.601263 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bc778a31-3a41-4260-92a3-c1acfba08e32-swiftconf\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.614977 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9k9mr\" (UniqueName: \"kubernetes.io/projected/bc778a31-3a41-4260-92a3-c1acfba08e32-kube-api-access-9k9mr\") pod \"swift-ring-rebalance-debug-vsqpb\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:58 crc kubenswrapper[4738]: I0307 07:33:58.711430 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:33:59 crc kubenswrapper[4738]: I0307 07:33:59.170564 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb"] Mar 07 07:33:59 crc kubenswrapper[4738]: I0307 07:33:59.838590 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" event={"ID":"bc778a31-3a41-4260-92a3-c1acfba08e32","Type":"ContainerStarted","Data":"577e9e7851da972d4f3b4c87c9821e29777d6577b4d2b80f3c0197b1e948bf55"} Mar 07 07:33:59 crc kubenswrapper[4738]: I0307 07:33:59.838630 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" event={"ID":"bc778a31-3a41-4260-92a3-c1acfba08e32","Type":"ContainerStarted","Data":"05ee2dc5bb72a36989c919a841abb2520e08e97039c851a88f14917d2ac0a3db"} Mar 07 07:33:59 crc kubenswrapper[4738]: I0307 07:33:59.862544 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" podStartSLOduration=1.8625208290000002 podStartE2EDuration="1.862520829s" podCreationTimestamp="2026-03-07 07:33:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:33:59.860830574 +0000 UTC m=+2058.325817905" watchObservedRunningTime="2026-03-07 07:33:59.862520829 +0000 UTC m=+2058.327508190" Mar 07 07:34:00 crc kubenswrapper[4738]: I0307 07:34:00.145723 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547814-wjw5r"] Mar 07 07:34:00 crc kubenswrapper[4738]: I0307 07:34:00.147886 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547814-wjw5r" Mar 07 07:34:00 crc kubenswrapper[4738]: I0307 07:34:00.154309 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:34:00 crc kubenswrapper[4738]: I0307 07:34:00.154414 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:34:00 crc kubenswrapper[4738]: I0307 07:34:00.156066 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:34:00 crc kubenswrapper[4738]: I0307 07:34:00.163758 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547814-wjw5r"] Mar 07 07:34:00 crc kubenswrapper[4738]: I0307 07:34:00.221816 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6z7ml\" (UniqueName: \"kubernetes.io/projected/a6d4ad66-c41b-42f2-9b0d-3e54aa59443e-kube-api-access-6z7ml\") pod \"auto-csr-approver-29547814-wjw5r\" (UID: \"a6d4ad66-c41b-42f2-9b0d-3e54aa59443e\") " pod="openshift-infra/auto-csr-approver-29547814-wjw5r" Mar 07 07:34:00 crc kubenswrapper[4738]: I0307 07:34:00.324133 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6z7ml\" (UniqueName: \"kubernetes.io/projected/a6d4ad66-c41b-42f2-9b0d-3e54aa59443e-kube-api-access-6z7ml\") pod \"auto-csr-approver-29547814-wjw5r\" (UID: \"a6d4ad66-c41b-42f2-9b0d-3e54aa59443e\") " pod="openshift-infra/auto-csr-approver-29547814-wjw5r" Mar 07 07:34:00 crc kubenswrapper[4738]: I0307 07:34:00.356930 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6z7ml\" (UniqueName: \"kubernetes.io/projected/a6d4ad66-c41b-42f2-9b0d-3e54aa59443e-kube-api-access-6z7ml\") pod \"auto-csr-approver-29547814-wjw5r\" (UID: \"a6d4ad66-c41b-42f2-9b0d-3e54aa59443e\") " 
pod="openshift-infra/auto-csr-approver-29547814-wjw5r" Mar 07 07:34:00 crc kubenswrapper[4738]: I0307 07:34:00.472901 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547814-wjw5r" Mar 07 07:34:00 crc kubenswrapper[4738]: I0307 07:34:00.746221 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547814-wjw5r"] Mar 07 07:34:00 crc kubenswrapper[4738]: W0307 07:34:00.789777 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6d4ad66_c41b_42f2_9b0d_3e54aa59443e.slice/crio-422691ade85b2316fd92ab9d9f9d7dced1d6efb826e9426e92d6ad7513b9ae84 WatchSource:0}: Error finding container 422691ade85b2316fd92ab9d9f9d7dced1d6efb826e9426e92d6ad7513b9ae84: Status 404 returned error can't find the container with id 422691ade85b2316fd92ab9d9f9d7dced1d6efb826e9426e92d6ad7513b9ae84 Mar 07 07:34:00 crc kubenswrapper[4738]: I0307 07:34:00.793820 4738 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:34:00 crc kubenswrapper[4738]: I0307 07:34:00.851417 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547814-wjw5r" event={"ID":"a6d4ad66-c41b-42f2-9b0d-3e54aa59443e","Type":"ContainerStarted","Data":"422691ade85b2316fd92ab9d9f9d7dced1d6efb826e9426e92d6ad7513b9ae84"} Mar 07 07:34:00 crc kubenswrapper[4738]: I0307 07:34:00.853182 4738 generic.go:334] "Generic (PLEG): container finished" podID="bc778a31-3a41-4260-92a3-c1acfba08e32" containerID="577e9e7851da972d4f3b4c87c9821e29777d6577b4d2b80f3c0197b1e948bf55" exitCode=0 Mar 07 07:34:00 crc kubenswrapper[4738]: I0307 07:34:00.853212 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" 
event={"ID":"bc778a31-3a41-4260-92a3-c1acfba08e32","Type":"ContainerDied","Data":"577e9e7851da972d4f3b4c87c9821e29777d6577b4d2b80f3c0197b1e948bf55"} Mar 07 07:34:01 crc kubenswrapper[4738]: I0307 07:34:01.385410 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:34:01 crc kubenswrapper[4738]: E0307 07:34:01.385746 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.166875 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.193256 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb"] Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.202797 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb"] Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.258530 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9k9mr\" (UniqueName: \"kubernetes.io/projected/bc778a31-3a41-4260-92a3-c1acfba08e32-kube-api-access-9k9mr\") pod \"bc778a31-3a41-4260-92a3-c1acfba08e32\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.258605 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bc778a31-3a41-4260-92a3-c1acfba08e32-etc-swift\") pod 
\"bc778a31-3a41-4260-92a3-c1acfba08e32\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.258627 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bc778a31-3a41-4260-92a3-c1acfba08e32-ring-data-devices\") pod \"bc778a31-3a41-4260-92a3-c1acfba08e32\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.258680 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bc778a31-3a41-4260-92a3-c1acfba08e32-dispersionconf\") pod \"bc778a31-3a41-4260-92a3-c1acfba08e32\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.258780 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bc778a31-3a41-4260-92a3-c1acfba08e32-swiftconf\") pod \"bc778a31-3a41-4260-92a3-c1acfba08e32\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.258830 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc778a31-3a41-4260-92a3-c1acfba08e32-scripts\") pod \"bc778a31-3a41-4260-92a3-c1acfba08e32\" (UID: \"bc778a31-3a41-4260-92a3-c1acfba08e32\") " Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.259499 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc778a31-3a41-4260-92a3-c1acfba08e32-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "bc778a31-3a41-4260-92a3-c1acfba08e32" (UID: "bc778a31-3a41-4260-92a3-c1acfba08e32"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.260327 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc778a31-3a41-4260-92a3-c1acfba08e32-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bc778a31-3a41-4260-92a3-c1acfba08e32" (UID: "bc778a31-3a41-4260-92a3-c1acfba08e32"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.263594 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc778a31-3a41-4260-92a3-c1acfba08e32-kube-api-access-9k9mr" (OuterVolumeSpecName: "kube-api-access-9k9mr") pod "bc778a31-3a41-4260-92a3-c1acfba08e32" (UID: "bc778a31-3a41-4260-92a3-c1acfba08e32"). InnerVolumeSpecName "kube-api-access-9k9mr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.278288 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc778a31-3a41-4260-92a3-c1acfba08e32-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "bc778a31-3a41-4260-92a3-c1acfba08e32" (UID: "bc778a31-3a41-4260-92a3-c1acfba08e32"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.278321 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc778a31-3a41-4260-92a3-c1acfba08e32-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "bc778a31-3a41-4260-92a3-c1acfba08e32" (UID: "bc778a31-3a41-4260-92a3-c1acfba08e32"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.290596 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc778a31-3a41-4260-92a3-c1acfba08e32-scripts" (OuterVolumeSpecName: "scripts") pod "bc778a31-3a41-4260-92a3-c1acfba08e32" (UID: "bc778a31-3a41-4260-92a3-c1acfba08e32"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.360258 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9k9mr\" (UniqueName: \"kubernetes.io/projected/bc778a31-3a41-4260-92a3-c1acfba08e32-kube-api-access-9k9mr\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.360291 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bc778a31-3a41-4260-92a3-c1acfba08e32-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.360301 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bc778a31-3a41-4260-92a3-c1acfba08e32-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.360313 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bc778a31-3a41-4260-92a3-c1acfba08e32-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.360321 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bc778a31-3a41-4260-92a3-c1acfba08e32-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.360329 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/bc778a31-3a41-4260-92a3-c1acfba08e32-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.392993 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc778a31-3a41-4260-92a3-c1acfba08e32" path="/var/lib/kubelet/pods/bc778a31-3a41-4260-92a3-c1acfba08e32/volumes" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.874150 4738 generic.go:334] "Generic (PLEG): container finished" podID="a6d4ad66-c41b-42f2-9b0d-3e54aa59443e" containerID="52f757dc617b73618016cb8816cc15f3a0f3d38c50a37f78f741d5ef77340d30" exitCode=0 Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.874261 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547814-wjw5r" event={"ID":"a6d4ad66-c41b-42f2-9b0d-3e54aa59443e","Type":"ContainerDied","Data":"52f757dc617b73618016cb8816cc15f3a0f3d38c50a37f78f741d5ef77340d30"} Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.876720 4738 scope.go:117] "RemoveContainer" containerID="577e9e7851da972d4f3b4c87c9821e29777d6577b4d2b80f3c0197b1e948bf55" Mar 07 07:34:02 crc kubenswrapper[4738]: I0307 07:34:02.876758 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vsqpb" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.406353 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg"] Mar 07 07:34:03 crc kubenswrapper[4738]: E0307 07:34:03.406993 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc778a31-3a41-4260-92a3-c1acfba08e32" containerName="swift-ring-rebalance" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.407017 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc778a31-3a41-4260-92a3-c1acfba08e32" containerName="swift-ring-rebalance" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.407383 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc778a31-3a41-4260-92a3-c1acfba08e32" containerName="swift-ring-rebalance" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.408240 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.415744 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.416070 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.423038 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg"] Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.479643 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cd3f0cc7-e727-4a35-91b0-4553f21732b6-dispersionconf\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.479715 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwsgz\" (UniqueName: \"kubernetes.io/projected/cd3f0cc7-e727-4a35-91b0-4553f21732b6-kube-api-access-xwsgz\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.479747 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cd3f0cc7-e727-4a35-91b0-4553f21732b6-swiftconf\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.479922 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cd3f0cc7-e727-4a35-91b0-4553f21732b6-etc-swift\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.480038 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd3f0cc7-e727-4a35-91b0-4553f21732b6-scripts\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.480207 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/cd3f0cc7-e727-4a35-91b0-4553f21732b6-ring-data-devices\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.582091 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd3f0cc7-e727-4a35-91b0-4553f21732b6-scripts\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.582189 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cd3f0cc7-e727-4a35-91b0-4553f21732b6-ring-data-devices\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.582250 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cd3f0cc7-e727-4a35-91b0-4553f21732b6-dispersionconf\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.582299 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwsgz\" (UniqueName: \"kubernetes.io/projected/cd3f0cc7-e727-4a35-91b0-4553f21732b6-kube-api-access-xwsgz\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.582322 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/cd3f0cc7-e727-4a35-91b0-4553f21732b6-swiftconf\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.582385 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cd3f0cc7-e727-4a35-91b0-4553f21732b6-etc-swift\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.582929 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cd3f0cc7-e727-4a35-91b0-4553f21732b6-etc-swift\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.582997 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cd3f0cc7-e727-4a35-91b0-4553f21732b6-ring-data-devices\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.582964 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd3f0cc7-e727-4a35-91b0-4553f21732b6-scripts\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.586446 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/cd3f0cc7-e727-4a35-91b0-4553f21732b6-dispersionconf\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.588576 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cd3f0cc7-e727-4a35-91b0-4553f21732b6-swiftconf\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.597705 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwsgz\" (UniqueName: \"kubernetes.io/projected/cd3f0cc7-e727-4a35-91b0-4553f21732b6-kube-api-access-xwsgz\") pod \"swift-ring-rebalance-debug-t6qxg\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:03 crc kubenswrapper[4738]: I0307 07:34:03.756479 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:04 crc kubenswrapper[4738]: I0307 07:34:04.094983 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547814-wjw5r" Mar 07 07:34:04 crc kubenswrapper[4738]: I0307 07:34:04.194735 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6z7ml\" (UniqueName: \"kubernetes.io/projected/a6d4ad66-c41b-42f2-9b0d-3e54aa59443e-kube-api-access-6z7ml\") pod \"a6d4ad66-c41b-42f2-9b0d-3e54aa59443e\" (UID: \"a6d4ad66-c41b-42f2-9b0d-3e54aa59443e\") " Mar 07 07:34:04 crc kubenswrapper[4738]: I0307 07:34:04.201394 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6d4ad66-c41b-42f2-9b0d-3e54aa59443e-kube-api-access-6z7ml" (OuterVolumeSpecName: "kube-api-access-6z7ml") pod "a6d4ad66-c41b-42f2-9b0d-3e54aa59443e" (UID: "a6d4ad66-c41b-42f2-9b0d-3e54aa59443e"). InnerVolumeSpecName "kube-api-access-6z7ml". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:34:04 crc kubenswrapper[4738]: I0307 07:34:04.210981 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg"] Mar 07 07:34:04 crc kubenswrapper[4738]: I0307 07:34:04.296728 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6z7ml\" (UniqueName: \"kubernetes.io/projected/a6d4ad66-c41b-42f2-9b0d-3e54aa59443e-kube-api-access-6z7ml\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:04 crc kubenswrapper[4738]: I0307 07:34:04.904282 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547814-wjw5r" Mar 07 07:34:04 crc kubenswrapper[4738]: I0307 07:34:04.904284 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547814-wjw5r" event={"ID":"a6d4ad66-c41b-42f2-9b0d-3e54aa59443e","Type":"ContainerDied","Data":"422691ade85b2316fd92ab9d9f9d7dced1d6efb826e9426e92d6ad7513b9ae84"} Mar 07 07:34:04 crc kubenswrapper[4738]: I0307 07:34:04.905602 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="422691ade85b2316fd92ab9d9f9d7dced1d6efb826e9426e92d6ad7513b9ae84" Mar 07 07:34:04 crc kubenswrapper[4738]: I0307 07:34:04.906667 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" event={"ID":"cd3f0cc7-e727-4a35-91b0-4553f21732b6","Type":"ContainerStarted","Data":"3cd362563b4d56f58b6239926eeb310ced33b46c2cb34634f1de4346cca22c87"} Mar 07 07:34:04 crc kubenswrapper[4738]: I0307 07:34:04.906794 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" event={"ID":"cd3f0cc7-e727-4a35-91b0-4553f21732b6","Type":"ContainerStarted","Data":"a4e1f110cf8e5ad1d4ca80c3cf50d2fbe0b857bb5559958efdea4d442b187d10"} Mar 07 07:34:04 crc kubenswrapper[4738]: I0307 07:34:04.933127 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" podStartSLOduration=1.933107812 podStartE2EDuration="1.933107812s" podCreationTimestamp="2026-03-07 07:34:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:34:04.924901011 +0000 UTC m=+2063.389888352" watchObservedRunningTime="2026-03-07 07:34:04.933107812 +0000 UTC m=+2063.398095143" Mar 07 07:34:05 crc kubenswrapper[4738]: I0307 07:34:05.162663 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-infra/auto-csr-approver-29547808-ggx8q"] Mar 07 07:34:05 crc kubenswrapper[4738]: I0307 07:34:05.172713 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547808-ggx8q"] Mar 07 07:34:05 crc kubenswrapper[4738]: I0307 07:34:05.920199 4738 generic.go:334] "Generic (PLEG): container finished" podID="cd3f0cc7-e727-4a35-91b0-4553f21732b6" containerID="3cd362563b4d56f58b6239926eeb310ced33b46c2cb34634f1de4346cca22c87" exitCode=0 Mar 07 07:34:05 crc kubenswrapper[4738]: I0307 07:34:05.920266 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" event={"ID":"cd3f0cc7-e727-4a35-91b0-4553f21732b6","Type":"ContainerDied","Data":"3cd362563b4d56f58b6239926eeb310ced33b46c2cb34634f1de4346cca22c87"} Mar 07 07:34:06 crc kubenswrapper[4738]: I0307 07:34:06.395216 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b1cc776-2076-4e7a-9ace-ca1ca6abaa41" path="/var/lib/kubelet/pods/3b1cc776-2076-4e7a-9ace-ca1ca6abaa41/volumes" Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.220911 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.256301 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg"] Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.261306 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg"] Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.343411 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd3f0cc7-e727-4a35-91b0-4553f21732b6-scripts\") pod \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.343525 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cd3f0cc7-e727-4a35-91b0-4553f21732b6-dispersionconf\") pod \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.343554 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cd3f0cc7-e727-4a35-91b0-4553f21732b6-swiftconf\") pod \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.343580 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwsgz\" (UniqueName: \"kubernetes.io/projected/cd3f0cc7-e727-4a35-91b0-4553f21732b6-kube-api-access-xwsgz\") pod \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.343617 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cd3f0cc7-e727-4a35-91b0-4553f21732b6-ring-data-devices\") pod \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.343679 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cd3f0cc7-e727-4a35-91b0-4553f21732b6-etc-swift\") pod \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\" (UID: \"cd3f0cc7-e727-4a35-91b0-4553f21732b6\") " Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.344437 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd3f0cc7-e727-4a35-91b0-4553f21732b6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "cd3f0cc7-e727-4a35-91b0-4553f21732b6" (UID: "cd3f0cc7-e727-4a35-91b0-4553f21732b6"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.344643 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd3f0cc7-e727-4a35-91b0-4553f21732b6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cd3f0cc7-e727-4a35-91b0-4553f21732b6" (UID: "cd3f0cc7-e727-4a35-91b0-4553f21732b6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.348739 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd3f0cc7-e727-4a35-91b0-4553f21732b6-kube-api-access-xwsgz" (OuterVolumeSpecName: "kube-api-access-xwsgz") pod "cd3f0cc7-e727-4a35-91b0-4553f21732b6" (UID: "cd3f0cc7-e727-4a35-91b0-4553f21732b6"). InnerVolumeSpecName "kube-api-access-xwsgz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.362613 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd3f0cc7-e727-4a35-91b0-4553f21732b6-scripts" (OuterVolumeSpecName: "scripts") pod "cd3f0cc7-e727-4a35-91b0-4553f21732b6" (UID: "cd3f0cc7-e727-4a35-91b0-4553f21732b6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.363869 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd3f0cc7-e727-4a35-91b0-4553f21732b6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "cd3f0cc7-e727-4a35-91b0-4553f21732b6" (UID: "cd3f0cc7-e727-4a35-91b0-4553f21732b6"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.365245 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cd3f0cc7-e727-4a35-91b0-4553f21732b6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "cd3f0cc7-e727-4a35-91b0-4553f21732b6" (UID: "cd3f0cc7-e727-4a35-91b0-4553f21732b6"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.445556 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cd3f0cc7-e727-4a35-91b0-4553f21732b6-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.445829 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd3f0cc7-e727-4a35-91b0-4553f21732b6-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.445843 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cd3f0cc7-e727-4a35-91b0-4553f21732b6-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.445856 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cd3f0cc7-e727-4a35-91b0-4553f21732b6-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.445869 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwsgz\" (UniqueName: \"kubernetes.io/projected/cd3f0cc7-e727-4a35-91b0-4553f21732b6-kube-api-access-xwsgz\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.445882 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cd3f0cc7-e727-4a35-91b0-4553f21732b6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.951772 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4e1f110cf8e5ad1d4ca80c3cf50d2fbe0b857bb5559958efdea4d442b187d10" Mar 07 07:34:07 crc kubenswrapper[4738]: I0307 07:34:07.951885 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-t6qxg" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.402296 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd3f0cc7-e727-4a35-91b0-4553f21732b6" path="/var/lib/kubelet/pods/cd3f0cc7-e727-4a35-91b0-4553f21732b6/volumes" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.490799 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt"] Mar 07 07:34:08 crc kubenswrapper[4738]: E0307 07:34:08.491356 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6d4ad66-c41b-42f2-9b0d-3e54aa59443e" containerName="oc" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.491378 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6d4ad66-c41b-42f2-9b0d-3e54aa59443e" containerName="oc" Mar 07 07:34:08 crc kubenswrapper[4738]: E0307 07:34:08.491402 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd3f0cc7-e727-4a35-91b0-4553f21732b6" containerName="swift-ring-rebalance" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.491410 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd3f0cc7-e727-4a35-91b0-4553f21732b6" containerName="swift-ring-rebalance" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.491575 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd3f0cc7-e727-4a35-91b0-4553f21732b6" containerName="swift-ring-rebalance" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.491592 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6d4ad66-c41b-42f2-9b0d-3e54aa59443e" containerName="oc" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.492166 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.499181 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.499449 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.504942 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt"] Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.564075 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6df03ac6-49d1-40a6-b709-ea602e6383a0-swiftconf\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.564580 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb67g\" (UniqueName: \"kubernetes.io/projected/6df03ac6-49d1-40a6-b709-ea602e6383a0-kube-api-access-vb67g\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.564926 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6df03ac6-49d1-40a6-b709-ea602e6383a0-dispersionconf\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.565217 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6df03ac6-49d1-40a6-b709-ea602e6383a0-ring-data-devices\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.565495 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6df03ac6-49d1-40a6-b709-ea602e6383a0-etc-swift\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.565868 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6df03ac6-49d1-40a6-b709-ea602e6383a0-scripts\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.668191 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6df03ac6-49d1-40a6-b709-ea602e6383a0-scripts\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.668336 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6df03ac6-49d1-40a6-b709-ea602e6383a0-swiftconf\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 
07:34:08.668386 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vb67g\" (UniqueName: \"kubernetes.io/projected/6df03ac6-49d1-40a6-b709-ea602e6383a0-kube-api-access-vb67g\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.668447 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6df03ac6-49d1-40a6-b709-ea602e6383a0-dispersionconf\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.669206 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6df03ac6-49d1-40a6-b709-ea602e6383a0-ring-data-devices\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.669331 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6df03ac6-49d1-40a6-b709-ea602e6383a0-etc-swift\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.669501 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6df03ac6-49d1-40a6-b709-ea602e6383a0-scripts\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc 
kubenswrapper[4738]: I0307 07:34:08.669881 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6df03ac6-49d1-40a6-b709-ea602e6383a0-etc-swift\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.669887 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6df03ac6-49d1-40a6-b709-ea602e6383a0-ring-data-devices\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.674855 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6df03ac6-49d1-40a6-b709-ea602e6383a0-swiftconf\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.682618 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6df03ac6-49d1-40a6-b709-ea602e6383a0-dispersionconf\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: I0307 07:34:08.696098 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb67g\" (UniqueName: \"kubernetes.io/projected/6df03ac6-49d1-40a6-b709-ea602e6383a0-kube-api-access-vb67g\") pod \"swift-ring-rebalance-debug-jkdlt\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:08 crc kubenswrapper[4738]: 
I0307 07:34:08.818005 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:09 crc kubenswrapper[4738]: I0307 07:34:09.258056 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt"] Mar 07 07:34:09 crc kubenswrapper[4738]: I0307 07:34:09.972608 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" event={"ID":"6df03ac6-49d1-40a6-b709-ea602e6383a0","Type":"ContainerStarted","Data":"40a15b6b1eb4dec92598e78e4f7c4576fffb3157b7eb7c0f673f6212da31e05a"} Mar 07 07:34:09 crc kubenswrapper[4738]: I0307 07:34:09.972967 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" event={"ID":"6df03ac6-49d1-40a6-b709-ea602e6383a0","Type":"ContainerStarted","Data":"cd7b7e12156c0b8069925e8068f85f1cd441122b4c88f8398f2050b08a3c69ac"} Mar 07 07:34:10 crc kubenswrapper[4738]: I0307 07:34:10.011916 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" podStartSLOduration=2.011891316 podStartE2EDuration="2.011891316s" podCreationTimestamp="2026-03-07 07:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:34:09.997360013 +0000 UTC m=+2068.462347394" watchObservedRunningTime="2026-03-07 07:34:10.011891316 +0000 UTC m=+2068.476878647" Mar 07 07:34:10 crc kubenswrapper[4738]: I0307 07:34:10.986464 4738 generic.go:334] "Generic (PLEG): container finished" podID="6df03ac6-49d1-40a6-b709-ea602e6383a0" containerID="40a15b6b1eb4dec92598e78e4f7c4576fffb3157b7eb7c0f673f6212da31e05a" exitCode=0 Mar 07 07:34:10 crc kubenswrapper[4738]: I0307 07:34:10.986530 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" 
event={"ID":"6df03ac6-49d1-40a6-b709-ea602e6383a0","Type":"ContainerDied","Data":"40a15b6b1eb4dec92598e78e4f7c4576fffb3157b7eb7c0f673f6212da31e05a"} Mar 07 07:34:12 crc kubenswrapper[4738]: I0307 07:34:12.348321 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:12 crc kubenswrapper[4738]: I0307 07:34:12.383112 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt"] Mar 07 07:34:12 crc kubenswrapper[4738]: I0307 07:34:12.413616 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt"] Mar 07 07:34:12 crc kubenswrapper[4738]: I0307 07:34:12.423900 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6df03ac6-49d1-40a6-b709-ea602e6383a0-swiftconf\") pod \"6df03ac6-49d1-40a6-b709-ea602e6383a0\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " Mar 07 07:34:12 crc kubenswrapper[4738]: I0307 07:34:12.423972 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6df03ac6-49d1-40a6-b709-ea602e6383a0-dispersionconf\") pod \"6df03ac6-49d1-40a6-b709-ea602e6383a0\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " Mar 07 07:34:12 crc kubenswrapper[4738]: I0307 07:34:12.424041 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6df03ac6-49d1-40a6-b709-ea602e6383a0-scripts\") pod \"6df03ac6-49d1-40a6-b709-ea602e6383a0\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " Mar 07 07:34:12 crc kubenswrapper[4738]: I0307 07:34:12.424081 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6df03ac6-49d1-40a6-b709-ea602e6383a0-ring-data-devices\") 
pod \"6df03ac6-49d1-40a6-b709-ea602e6383a0\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " Mar 07 07:34:12 crc kubenswrapper[4738]: I0307 07:34:12.424134 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6df03ac6-49d1-40a6-b709-ea602e6383a0-etc-swift\") pod \"6df03ac6-49d1-40a6-b709-ea602e6383a0\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " Mar 07 07:34:12 crc kubenswrapper[4738]: I0307 07:34:12.424410 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb67g\" (UniqueName: \"kubernetes.io/projected/6df03ac6-49d1-40a6-b709-ea602e6383a0-kube-api-access-vb67g\") pod \"6df03ac6-49d1-40a6-b709-ea602e6383a0\" (UID: \"6df03ac6-49d1-40a6-b709-ea602e6383a0\") " Mar 07 07:34:12 crc kubenswrapper[4738]: I0307 07:34:12.425621 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6df03ac6-49d1-40a6-b709-ea602e6383a0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6df03ac6-49d1-40a6-b709-ea602e6383a0" (UID: "6df03ac6-49d1-40a6-b709-ea602e6383a0"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:12 crc kubenswrapper[4738]: I0307 07:34:12.426547 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6df03ac6-49d1-40a6-b709-ea602e6383a0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6df03ac6-49d1-40a6-b709-ea602e6383a0" (UID: "6df03ac6-49d1-40a6-b709-ea602e6383a0"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:34:12 crc kubenswrapper[4738]: I0307 07:34:12.441923 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6df03ac6-49d1-40a6-b709-ea602e6383a0-kube-api-access-vb67g" (OuterVolumeSpecName: "kube-api-access-vb67g") pod "6df03ac6-49d1-40a6-b709-ea602e6383a0" (UID: "6df03ac6-49d1-40a6-b709-ea602e6383a0"). InnerVolumeSpecName "kube-api-access-vb67g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:12.445312 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df03ac6-49d1-40a6-b709-ea602e6383a0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6df03ac6-49d1-40a6-b709-ea602e6383a0" (UID: "6df03ac6-49d1-40a6-b709-ea602e6383a0"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:12.448308 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6df03ac6-49d1-40a6-b709-ea602e6383a0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6df03ac6-49d1-40a6-b709-ea602e6383a0" (UID: "6df03ac6-49d1-40a6-b709-ea602e6383a0"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:12.456210 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6df03ac6-49d1-40a6-b709-ea602e6383a0-scripts" (OuterVolumeSpecName: "scripts") pod "6df03ac6-49d1-40a6-b709-ea602e6383a0" (UID: "6df03ac6-49d1-40a6-b709-ea602e6383a0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:12.527183 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6df03ac6-49d1-40a6-b709-ea602e6383a0-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:12.527219 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vb67g\" (UniqueName: \"kubernetes.io/projected/6df03ac6-49d1-40a6-b709-ea602e6383a0-kube-api-access-vb67g\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:12.527234 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6df03ac6-49d1-40a6-b709-ea602e6383a0-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:12.527245 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6df03ac6-49d1-40a6-b709-ea602e6383a0-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:12.527258 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6df03ac6-49d1-40a6-b709-ea602e6383a0-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:12.527270 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6df03ac6-49d1-40a6-b709-ea602e6383a0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.009988 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd7b7e12156c0b8069925e8068f85f1cd441122b4c88f8398f2050b08a3c69ac" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.010306 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jkdlt" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.532811 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-88gwc"] Mar 07 07:34:13 crc kubenswrapper[4738]: E0307 07:34:13.533200 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6df03ac6-49d1-40a6-b709-ea602e6383a0" containerName="swift-ring-rebalance" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.533216 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="6df03ac6-49d1-40a6-b709-ea602e6383a0" containerName="swift-ring-rebalance" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.533371 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="6df03ac6-49d1-40a6-b709-ea602e6383a0" containerName="swift-ring-rebalance" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.534144 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.544214 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-88gwc"] Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.545549 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.545722 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.645659 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mfrv\" (UniqueName: \"kubernetes.io/projected/08940236-2b90-4449-90ef-0a7cb8814fd7-kube-api-access-8mfrv\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.645705 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/08940236-2b90-4449-90ef-0a7cb8814fd7-ring-data-devices\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.645744 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08940236-2b90-4449-90ef-0a7cb8814fd7-scripts\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.645801 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/08940236-2b90-4449-90ef-0a7cb8814fd7-dispersionconf\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.645847 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/08940236-2b90-4449-90ef-0a7cb8814fd7-etc-swift\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.645884 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/08940236-2b90-4449-90ef-0a7cb8814fd7-swiftconf\") pod 
\"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.757930 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mfrv\" (UniqueName: \"kubernetes.io/projected/08940236-2b90-4449-90ef-0a7cb8814fd7-kube-api-access-8mfrv\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.758001 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/08940236-2b90-4449-90ef-0a7cb8814fd7-ring-data-devices\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.758029 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08940236-2b90-4449-90ef-0a7cb8814fd7-scripts\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.758122 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/08940236-2b90-4449-90ef-0a7cb8814fd7-dispersionconf\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.758199 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/08940236-2b90-4449-90ef-0a7cb8814fd7-etc-swift\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.758254 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/08940236-2b90-4449-90ef-0a7cb8814fd7-swiftconf\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.759404 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/08940236-2b90-4449-90ef-0a7cb8814fd7-etc-swift\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.760292 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/08940236-2b90-4449-90ef-0a7cb8814fd7-ring-data-devices\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.760318 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08940236-2b90-4449-90ef-0a7cb8814fd7-scripts\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.765423 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/08940236-2b90-4449-90ef-0a7cb8814fd7-dispersionconf\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.766057 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/08940236-2b90-4449-90ef-0a7cb8814fd7-swiftconf\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.779077 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mfrv\" (UniqueName: \"kubernetes.io/projected/08940236-2b90-4449-90ef-0a7cb8814fd7-kube-api-access-8mfrv\") pod \"swift-ring-rebalance-debug-88gwc\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:13 crc kubenswrapper[4738]: I0307 07:34:13.878369 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:14 crc kubenswrapper[4738]: I0307 07:34:14.180676 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-88gwc"] Mar 07 07:34:14 crc kubenswrapper[4738]: I0307 07:34:14.402184 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6df03ac6-49d1-40a6-b709-ea602e6383a0" path="/var/lib/kubelet/pods/6df03ac6-49d1-40a6-b709-ea602e6383a0/volumes" Mar 07 07:34:15 crc kubenswrapper[4738]: I0307 07:34:15.031236 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" event={"ID":"08940236-2b90-4449-90ef-0a7cb8814fd7","Type":"ContainerStarted","Data":"d9d1ee8b76764a0fe674c7734b21dd1b4edda111ff8a4376b8a3a7e7b306938e"} Mar 07 07:34:15 crc kubenswrapper[4738]: I0307 07:34:15.031573 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" event={"ID":"08940236-2b90-4449-90ef-0a7cb8814fd7","Type":"ContainerStarted","Data":"b5b2ca1c70ec9df1f224a53517f6044a145d868577868f8bc4a866389c0aebb7"} Mar 07 07:34:15 crc kubenswrapper[4738]: I0307 07:34:15.052496 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" podStartSLOduration=2.052475296 podStartE2EDuration="2.052475296s" podCreationTimestamp="2026-03-07 07:34:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:34:15.049962968 +0000 UTC m=+2073.514950289" watchObservedRunningTime="2026-03-07 07:34:15.052475296 +0000 UTC m=+2073.517462627" Mar 07 07:34:15 crc kubenswrapper[4738]: I0307 07:34:15.385567 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:34:15 crc kubenswrapper[4738]: E0307 07:34:15.385845 4738 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:34:16 crc kubenswrapper[4738]: I0307 07:34:16.040827 4738 generic.go:334] "Generic (PLEG): container finished" podID="08940236-2b90-4449-90ef-0a7cb8814fd7" containerID="d9d1ee8b76764a0fe674c7734b21dd1b4edda111ff8a4376b8a3a7e7b306938e" exitCode=0 Mar 07 07:34:16 crc kubenswrapper[4738]: I0307 07:34:16.040891 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" event={"ID":"08940236-2b90-4449-90ef-0a7cb8814fd7","Type":"ContainerDied","Data":"d9d1ee8b76764a0fe674c7734b21dd1b4edda111ff8a4376b8a3a7e7b306938e"} Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.388362 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.426826 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-88gwc"] Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.431929 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-88gwc"] Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.534617 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mfrv\" (UniqueName: \"kubernetes.io/projected/08940236-2b90-4449-90ef-0a7cb8814fd7-kube-api-access-8mfrv\") pod \"08940236-2b90-4449-90ef-0a7cb8814fd7\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.534681 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/08940236-2b90-4449-90ef-0a7cb8814fd7-swiftconf\") pod \"08940236-2b90-4449-90ef-0a7cb8814fd7\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.534723 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/08940236-2b90-4449-90ef-0a7cb8814fd7-ring-data-devices\") pod \"08940236-2b90-4449-90ef-0a7cb8814fd7\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.534781 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/08940236-2b90-4449-90ef-0a7cb8814fd7-etc-swift\") pod \"08940236-2b90-4449-90ef-0a7cb8814fd7\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.534821 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/08940236-2b90-4449-90ef-0a7cb8814fd7-scripts\") pod \"08940236-2b90-4449-90ef-0a7cb8814fd7\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.534898 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/08940236-2b90-4449-90ef-0a7cb8814fd7-dispersionconf\") pod \"08940236-2b90-4449-90ef-0a7cb8814fd7\" (UID: \"08940236-2b90-4449-90ef-0a7cb8814fd7\") " Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.535889 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08940236-2b90-4449-90ef-0a7cb8814fd7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "08940236-2b90-4449-90ef-0a7cb8814fd7" (UID: "08940236-2b90-4449-90ef-0a7cb8814fd7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.536764 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08940236-2b90-4449-90ef-0a7cb8814fd7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "08940236-2b90-4449-90ef-0a7cb8814fd7" (UID: "08940236-2b90-4449-90ef-0a7cb8814fd7"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.540519 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08940236-2b90-4449-90ef-0a7cb8814fd7-kube-api-access-8mfrv" (OuterVolumeSpecName: "kube-api-access-8mfrv") pod "08940236-2b90-4449-90ef-0a7cb8814fd7" (UID: "08940236-2b90-4449-90ef-0a7cb8814fd7"). InnerVolumeSpecName "kube-api-access-8mfrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.556526 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08940236-2b90-4449-90ef-0a7cb8814fd7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "08940236-2b90-4449-90ef-0a7cb8814fd7" (UID: "08940236-2b90-4449-90ef-0a7cb8814fd7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.559899 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/08940236-2b90-4449-90ef-0a7cb8814fd7-scripts" (OuterVolumeSpecName: "scripts") pod "08940236-2b90-4449-90ef-0a7cb8814fd7" (UID: "08940236-2b90-4449-90ef-0a7cb8814fd7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.561705 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08940236-2b90-4449-90ef-0a7cb8814fd7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "08940236-2b90-4449-90ef-0a7cb8814fd7" (UID: "08940236-2b90-4449-90ef-0a7cb8814fd7"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.636878 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/08940236-2b90-4449-90ef-0a7cb8814fd7-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.636912 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/08940236-2b90-4449-90ef-0a7cb8814fd7-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.636924 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/08940236-2b90-4449-90ef-0a7cb8814fd7-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.636938 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mfrv\" (UniqueName: \"kubernetes.io/projected/08940236-2b90-4449-90ef-0a7cb8814fd7-kube-api-access-8mfrv\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.636951 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/08940236-2b90-4449-90ef-0a7cb8814fd7-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:17 crc kubenswrapper[4738]: I0307 07:34:17.636965 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/08940236-2b90-4449-90ef-0a7cb8814fd7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.063641 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5b2ca1c70ec9df1f224a53517f6044a145d868577868f8bc4a866389c0aebb7" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.063758 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-88gwc" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.401824 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08940236-2b90-4449-90ef-0a7cb8814fd7" path="/var/lib/kubelet/pods/08940236-2b90-4449-90ef-0a7cb8814fd7/volumes" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.572007 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv"] Mar 07 07:34:18 crc kubenswrapper[4738]: E0307 07:34:18.572585 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08940236-2b90-4449-90ef-0a7cb8814fd7" containerName="swift-ring-rebalance" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.572616 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="08940236-2b90-4449-90ef-0a7cb8814fd7" containerName="swift-ring-rebalance" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.572899 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="08940236-2b90-4449-90ef-0a7cb8814fd7" containerName="swift-ring-rebalance" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.573707 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.577897 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.579075 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.582650 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv"] Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.755403 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-dispersionconf\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.755550 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-etc-swift\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.755654 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrcdh\" (UniqueName: \"kubernetes.io/projected/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-kube-api-access-zrcdh\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.755731 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-scripts\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.755865 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-ring-data-devices\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.755984 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-swiftconf\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.857896 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-ring-data-devices\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.858017 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-swiftconf\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc 
kubenswrapper[4738]: I0307 07:34:18.858110 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-dispersionconf\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.858235 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-etc-swift\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.858284 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrcdh\" (UniqueName: \"kubernetes.io/projected/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-kube-api-access-zrcdh\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.858318 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-scripts\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.859797 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-etc-swift\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc 
kubenswrapper[4738]: I0307 07:34:18.859834 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-ring-data-devices\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.859852 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-scripts\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.866832 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-swiftconf\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.867356 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-dispersionconf\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: I0307 07:34:18.879409 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrcdh\" (UniqueName: \"kubernetes.io/projected/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-kube-api-access-zrcdh\") pod \"swift-ring-rebalance-debug-q4vwv\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:18 crc kubenswrapper[4738]: 
I0307 07:34:18.903707 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:19 crc kubenswrapper[4738]: I0307 07:34:19.372981 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv"] Mar 07 07:34:19 crc kubenswrapper[4738]: W0307 07:34:19.380000 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96fdf75a_dba3_40fe_bddd_8d6c62afcf59.slice/crio-0280580e8d35bb4b2769250513cbb688f8cfd67ae3ae0dc8bec4e681f27d3b81 WatchSource:0}: Error finding container 0280580e8d35bb4b2769250513cbb688f8cfd67ae3ae0dc8bec4e681f27d3b81: Status 404 returned error can't find the container with id 0280580e8d35bb4b2769250513cbb688f8cfd67ae3ae0dc8bec4e681f27d3b81 Mar 07 07:34:20 crc kubenswrapper[4738]: I0307 07:34:20.085580 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" event={"ID":"96fdf75a-dba3-40fe-bddd-8d6c62afcf59","Type":"ContainerStarted","Data":"3ef8089f860ad2bace4896f0738fb35a7660185b99d8865471c7248248b4272f"} Mar 07 07:34:20 crc kubenswrapper[4738]: I0307 07:34:20.086029 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" event={"ID":"96fdf75a-dba3-40fe-bddd-8d6c62afcf59","Type":"ContainerStarted","Data":"0280580e8d35bb4b2769250513cbb688f8cfd67ae3ae0dc8bec4e681f27d3b81"} Mar 07 07:34:20 crc kubenswrapper[4738]: I0307 07:34:20.108593 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" podStartSLOduration=2.108569566 podStartE2EDuration="2.108569566s" podCreationTimestamp="2026-03-07 07:34:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:34:20.106682854 +0000 
UTC m=+2078.571670205" watchObservedRunningTime="2026-03-07 07:34:20.108569566 +0000 UTC m=+2078.573556897" Mar 07 07:34:22 crc kubenswrapper[4738]: I0307 07:34:22.102178 4738 generic.go:334] "Generic (PLEG): container finished" podID="96fdf75a-dba3-40fe-bddd-8d6c62afcf59" containerID="3ef8089f860ad2bace4896f0738fb35a7660185b99d8865471c7248248b4272f" exitCode=0 Mar 07 07:34:22 crc kubenswrapper[4738]: I0307 07:34:22.102266 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" event={"ID":"96fdf75a-dba3-40fe-bddd-8d6c62afcf59","Type":"ContainerDied","Data":"3ef8089f860ad2bace4896f0738fb35a7660185b99d8865471c7248248b4272f"} Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.451003 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.481953 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv"] Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.486948 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv"] Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.569236 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-etc-swift\") pod \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.569306 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-dispersionconf\") pod \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.569418 
4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrcdh\" (UniqueName: \"kubernetes.io/projected/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-kube-api-access-zrcdh\") pod \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.569477 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-scripts\") pod \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.569536 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-swiftconf\") pod \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.569631 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-ring-data-devices\") pod \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\" (UID: \"96fdf75a-dba3-40fe-bddd-8d6c62afcf59\") " Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.570023 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "96fdf75a-dba3-40fe-bddd-8d6c62afcf59" (UID: "96fdf75a-dba3-40fe-bddd-8d6c62afcf59"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.570461 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "96fdf75a-dba3-40fe-bddd-8d6c62afcf59" (UID: "96fdf75a-dba3-40fe-bddd-8d6c62afcf59"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.570958 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.571014 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.574942 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-kube-api-access-zrcdh" (OuterVolumeSpecName: "kube-api-access-zrcdh") pod "96fdf75a-dba3-40fe-bddd-8d6c62afcf59" (UID: "96fdf75a-dba3-40fe-bddd-8d6c62afcf59"). InnerVolumeSpecName "kube-api-access-zrcdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.589937 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "96fdf75a-dba3-40fe-bddd-8d6c62afcf59" (UID: "96fdf75a-dba3-40fe-bddd-8d6c62afcf59"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.594767 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-scripts" (OuterVolumeSpecName: "scripts") pod "96fdf75a-dba3-40fe-bddd-8d6c62afcf59" (UID: "96fdf75a-dba3-40fe-bddd-8d6c62afcf59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.605344 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "96fdf75a-dba3-40fe-bddd-8d6c62afcf59" (UID: "96fdf75a-dba3-40fe-bddd-8d6c62afcf59"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.672295 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.672328 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrcdh\" (UniqueName: \"kubernetes.io/projected/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-kube-api-access-zrcdh\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.672340 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:23 crc kubenswrapper[4738]: I0307 07:34:23.672349 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/96fdf75a-dba3-40fe-bddd-8d6c62afcf59-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 
07:34:24.131232 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0280580e8d35bb4b2769250513cbb688f8cfd67ae3ae0dc8bec4e681f27d3b81" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.131271 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-q4vwv" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.413633 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96fdf75a-dba3-40fe-bddd-8d6c62afcf59" path="/var/lib/kubelet/pods/96fdf75a-dba3-40fe-bddd-8d6c62afcf59/volumes" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.679486 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp"] Mar 07 07:34:24 crc kubenswrapper[4738]: E0307 07:34:24.679759 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96fdf75a-dba3-40fe-bddd-8d6c62afcf59" containerName="swift-ring-rebalance" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.679773 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="96fdf75a-dba3-40fe-bddd-8d6c62afcf59" containerName="swift-ring-rebalance" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.679896 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="96fdf75a-dba3-40fe-bddd-8d6c62afcf59" containerName="swift-ring-rebalance" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.680349 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.682727 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.685610 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.691924 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp"] Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.692920 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/69e3a62d-d1b5-48b0-8d48-268331a2deaa-swiftconf\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.693076 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/69e3a62d-d1b5-48b0-8d48-268331a2deaa-dispersionconf\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.693127 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/69e3a62d-d1b5-48b0-8d48-268331a2deaa-etc-swift\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.693216 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-ws76z\" (UniqueName: \"kubernetes.io/projected/69e3a62d-d1b5-48b0-8d48-268331a2deaa-kube-api-access-ws76z\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.693261 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69e3a62d-d1b5-48b0-8d48-268331a2deaa-scripts\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.693296 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/69e3a62d-d1b5-48b0-8d48-268331a2deaa-ring-data-devices\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.794222 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws76z\" (UniqueName: \"kubernetes.io/projected/69e3a62d-d1b5-48b0-8d48-268331a2deaa-kube-api-access-ws76z\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.794307 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69e3a62d-d1b5-48b0-8d48-268331a2deaa-scripts\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 
07:34:24.794341 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/69e3a62d-d1b5-48b0-8d48-268331a2deaa-ring-data-devices\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.794369 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/69e3a62d-d1b5-48b0-8d48-268331a2deaa-swiftconf\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.794438 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/69e3a62d-d1b5-48b0-8d48-268331a2deaa-dispersionconf\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.794460 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/69e3a62d-d1b5-48b0-8d48-268331a2deaa-etc-swift\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.794802 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/69e3a62d-d1b5-48b0-8d48-268331a2deaa-etc-swift\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 
07:34:24.795746 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/69e3a62d-d1b5-48b0-8d48-268331a2deaa-ring-data-devices\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.795845 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69e3a62d-d1b5-48b0-8d48-268331a2deaa-scripts\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.798721 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/69e3a62d-d1b5-48b0-8d48-268331a2deaa-swiftconf\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.799414 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/69e3a62d-d1b5-48b0-8d48-268331a2deaa-dispersionconf\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:24 crc kubenswrapper[4738]: I0307 07:34:24.817682 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws76z\" (UniqueName: \"kubernetes.io/projected/69e3a62d-d1b5-48b0-8d48-268331a2deaa-kube-api-access-ws76z\") pod \"swift-ring-rebalance-debug-8zjlp\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:25 crc kubenswrapper[4738]: I0307 07:34:25.002907 4738 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:25 crc kubenswrapper[4738]: I0307 07:34:25.487481 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp"] Mar 07 07:34:25 crc kubenswrapper[4738]: W0307 07:34:25.511794 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69e3a62d_d1b5_48b0_8d48_268331a2deaa.slice/crio-852ae3b814d842a13b110da9cfb8158e48b7d73eddc003b66362dc3fbd47d698 WatchSource:0}: Error finding container 852ae3b814d842a13b110da9cfb8158e48b7d73eddc003b66362dc3fbd47d698: Status 404 returned error can't find the container with id 852ae3b814d842a13b110da9cfb8158e48b7d73eddc003b66362dc3fbd47d698 Mar 07 07:34:26 crc kubenswrapper[4738]: I0307 07:34:26.148452 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" event={"ID":"69e3a62d-d1b5-48b0-8d48-268331a2deaa","Type":"ContainerStarted","Data":"5907f433b5c5166e09640ce5121a9f400f2088580f7f8e55d10611dc4060437c"} Mar 07 07:34:26 crc kubenswrapper[4738]: I0307 07:34:26.148794 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" event={"ID":"69e3a62d-d1b5-48b0-8d48-268331a2deaa","Type":"ContainerStarted","Data":"852ae3b814d842a13b110da9cfb8158e48b7d73eddc003b66362dc3fbd47d698"} Mar 07 07:34:26 crc kubenswrapper[4738]: I0307 07:34:26.174066 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" podStartSLOduration=2.174044801 podStartE2EDuration="2.174044801s" podCreationTimestamp="2026-03-07 07:34:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:34:26.170620358 +0000 UTC m=+2084.635607699" 
watchObservedRunningTime="2026-03-07 07:34:26.174044801 +0000 UTC m=+2084.639032132" Mar 07 07:34:27 crc kubenswrapper[4738]: I0307 07:34:27.385941 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:34:28 crc kubenswrapper[4738]: I0307 07:34:28.183285 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerStarted","Data":"3f2f40850859badc575bb7ed1a011243a7dda83e8be3d1302a68de4787f67eff"} Mar 07 07:34:28 crc kubenswrapper[4738]: I0307 07:34:28.186060 4738 generic.go:334] "Generic (PLEG): container finished" podID="69e3a62d-d1b5-48b0-8d48-268331a2deaa" containerID="5907f433b5c5166e09640ce5121a9f400f2088580f7f8e55d10611dc4060437c" exitCode=0 Mar 07 07:34:28 crc kubenswrapper[4738]: I0307 07:34:28.186098 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" event={"ID":"69e3a62d-d1b5-48b0-8d48-268331a2deaa","Type":"ContainerDied","Data":"5907f433b5c5166e09640ce5121a9f400f2088580f7f8e55d10611dc4060437c"} Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.466942 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.501308 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp"] Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.506797 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp"] Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.666324 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws76z\" (UniqueName: \"kubernetes.io/projected/69e3a62d-d1b5-48b0-8d48-268331a2deaa-kube-api-access-ws76z\") pod \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.666382 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/69e3a62d-d1b5-48b0-8d48-268331a2deaa-ring-data-devices\") pod \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.666416 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69e3a62d-d1b5-48b0-8d48-268331a2deaa-scripts\") pod \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.666482 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/69e3a62d-d1b5-48b0-8d48-268331a2deaa-etc-swift\") pod \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.666521 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/69e3a62d-d1b5-48b0-8d48-268331a2deaa-swiftconf\") pod \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.666637 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/69e3a62d-d1b5-48b0-8d48-268331a2deaa-dispersionconf\") pod \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\" (UID: \"69e3a62d-d1b5-48b0-8d48-268331a2deaa\") " Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.668015 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69e3a62d-d1b5-48b0-8d48-268331a2deaa-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "69e3a62d-d1b5-48b0-8d48-268331a2deaa" (UID: "69e3a62d-d1b5-48b0-8d48-268331a2deaa"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.668047 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69e3a62d-d1b5-48b0-8d48-268331a2deaa-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "69e3a62d-d1b5-48b0-8d48-268331a2deaa" (UID: "69e3a62d-d1b5-48b0-8d48-268331a2deaa"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.677284 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69e3a62d-d1b5-48b0-8d48-268331a2deaa-kube-api-access-ws76z" (OuterVolumeSpecName: "kube-api-access-ws76z") pod "69e3a62d-d1b5-48b0-8d48-268331a2deaa" (UID: "69e3a62d-d1b5-48b0-8d48-268331a2deaa"). InnerVolumeSpecName "kube-api-access-ws76z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.688446 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69e3a62d-d1b5-48b0-8d48-268331a2deaa-scripts" (OuterVolumeSpecName: "scripts") pod "69e3a62d-d1b5-48b0-8d48-268331a2deaa" (UID: "69e3a62d-d1b5-48b0-8d48-268331a2deaa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.688741 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e3a62d-d1b5-48b0-8d48-268331a2deaa-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "69e3a62d-d1b5-48b0-8d48-268331a2deaa" (UID: "69e3a62d-d1b5-48b0-8d48-268331a2deaa"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.692977 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69e3a62d-d1b5-48b0-8d48-268331a2deaa-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "69e3a62d-d1b5-48b0-8d48-268331a2deaa" (UID: "69e3a62d-d1b5-48b0-8d48-268331a2deaa"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.768826 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws76z\" (UniqueName: \"kubernetes.io/projected/69e3a62d-d1b5-48b0-8d48-268331a2deaa-kube-api-access-ws76z\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.769119 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/69e3a62d-d1b5-48b0-8d48-268331a2deaa-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.769180 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69e3a62d-d1b5-48b0-8d48-268331a2deaa-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.769203 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/69e3a62d-d1b5-48b0-8d48-268331a2deaa-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.769222 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/69e3a62d-d1b5-48b0-8d48-268331a2deaa-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:29 crc kubenswrapper[4738]: I0307 07:34:29.769242 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/69e3a62d-d1b5-48b0-8d48-268331a2deaa-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.205466 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="852ae3b814d842a13b110da9cfb8158e48b7d73eddc003b66362dc3fbd47d698" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.205567 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8zjlp" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.395820 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69e3a62d-d1b5-48b0-8d48-268331a2deaa" path="/var/lib/kubelet/pods/69e3a62d-d1b5-48b0-8d48-268331a2deaa/volumes" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.658850 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w785r"] Mar 07 07:34:30 crc kubenswrapper[4738]: E0307 07:34:30.659536 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69e3a62d-d1b5-48b0-8d48-268331a2deaa" containerName="swift-ring-rebalance" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.659570 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="69e3a62d-d1b5-48b0-8d48-268331a2deaa" containerName="swift-ring-rebalance" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.659864 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="69e3a62d-d1b5-48b0-8d48-268331a2deaa" containerName="swift-ring-rebalance" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.660786 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.664238 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.666975 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.705091 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1c53c8b5-e2cb-4953-b964-a9b5c199c312-etc-swift\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.705210 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c53c8b5-e2cb-4953-b964-a9b5c199c312-scripts\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.705238 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1c53c8b5-e2cb-4953-b964-a9b5c199c312-ring-data-devices\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.705275 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1c53c8b5-e2cb-4953-b964-a9b5c199c312-swiftconf\") pod \"swift-ring-rebalance-debug-w785r\" (UID: 
\"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.705297 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn8nj\" (UniqueName: \"kubernetes.io/projected/1c53c8b5-e2cb-4953-b964-a9b5c199c312-kube-api-access-jn8nj\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.705343 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1c53c8b5-e2cb-4953-b964-a9b5c199c312-dispersionconf\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.710602 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w785r"] Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.806098 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1c53c8b5-e2cb-4953-b964-a9b5c199c312-etc-swift\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.806210 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c53c8b5-e2cb-4953-b964-a9b5c199c312-scripts\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.806473 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1c53c8b5-e2cb-4953-b964-a9b5c199c312-ring-data-devices\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.806558 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1c53c8b5-e2cb-4953-b964-a9b5c199c312-swiftconf\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.806591 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn8nj\" (UniqueName: \"kubernetes.io/projected/1c53c8b5-e2cb-4953-b964-a9b5c199c312-kube-api-access-jn8nj\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.806687 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1c53c8b5-e2cb-4953-b964-a9b5c199c312-dispersionconf\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.806982 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1c53c8b5-e2cb-4953-b964-a9b5c199c312-etc-swift\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 
07:34:30.807075 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c53c8b5-e2cb-4953-b964-a9b5c199c312-scripts\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.807099 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1c53c8b5-e2cb-4953-b964-a9b5c199c312-ring-data-devices\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.810581 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1c53c8b5-e2cb-4953-b964-a9b5c199c312-dispersionconf\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.811601 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1c53c8b5-e2cb-4953-b964-a9b5c199c312-swiftconf\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:30 crc kubenswrapper[4738]: I0307 07:34:30.822787 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn8nj\" (UniqueName: \"kubernetes.io/projected/1c53c8b5-e2cb-4953-b964-a9b5c199c312-kube-api-access-jn8nj\") pod \"swift-ring-rebalance-debug-w785r\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:31 crc kubenswrapper[4738]: I0307 07:34:31.017700 4738 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:31 crc kubenswrapper[4738]: I0307 07:34:31.473198 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w785r"] Mar 07 07:34:31 crc kubenswrapper[4738]: W0307 07:34:31.483387 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1c53c8b5_e2cb_4953_b964_a9b5c199c312.slice/crio-b0f40460e778ea0c8b40099d9af71794663f18405f7f7da31eccfa947f477a56 WatchSource:0}: Error finding container b0f40460e778ea0c8b40099d9af71794663f18405f7f7da31eccfa947f477a56: Status 404 returned error can't find the container with id b0f40460e778ea0c8b40099d9af71794663f18405f7f7da31eccfa947f477a56 Mar 07 07:34:32 crc kubenswrapper[4738]: I0307 07:34:32.230816 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" event={"ID":"1c53c8b5-e2cb-4953-b964-a9b5c199c312","Type":"ContainerStarted","Data":"a0036f66f349d52b02c46626e418de735262423574bd6ba21835f94fa0e65971"} Mar 07 07:34:32 crc kubenswrapper[4738]: I0307 07:34:32.231408 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" event={"ID":"1c53c8b5-e2cb-4953-b964-a9b5c199c312","Type":"ContainerStarted","Data":"b0f40460e778ea0c8b40099d9af71794663f18405f7f7da31eccfa947f477a56"} Mar 07 07:34:32 crc kubenswrapper[4738]: I0307 07:34:32.274528 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" podStartSLOduration=2.274512974 podStartE2EDuration="2.274512974s" podCreationTimestamp="2026-03-07 07:34:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:34:32.269511249 +0000 UTC m=+2090.734498580" 
watchObservedRunningTime="2026-03-07 07:34:32.274512974 +0000 UTC m=+2090.739500295" Mar 07 07:34:33 crc kubenswrapper[4738]: I0307 07:34:33.245852 4738 generic.go:334] "Generic (PLEG): container finished" podID="1c53c8b5-e2cb-4953-b964-a9b5c199c312" containerID="a0036f66f349d52b02c46626e418de735262423574bd6ba21835f94fa0e65971" exitCode=0 Mar 07 07:34:33 crc kubenswrapper[4738]: I0307 07:34:33.245918 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" event={"ID":"1c53c8b5-e2cb-4953-b964-a9b5c199c312","Type":"ContainerDied","Data":"a0036f66f349d52b02c46626e418de735262423574bd6ba21835f94fa0e65971"} Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.587409 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.622081 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w785r"] Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.628458 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w785r"] Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.668466 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1c53c8b5-e2cb-4953-b964-a9b5c199c312-etc-swift\") pod \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.668587 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1c53c8b5-e2cb-4953-b964-a9b5c199c312-swiftconf\") pod \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.668631 4738 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-jn8nj\" (UniqueName: \"kubernetes.io/projected/1c53c8b5-e2cb-4953-b964-a9b5c199c312-kube-api-access-jn8nj\") pod \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.668811 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1c53c8b5-e2cb-4953-b964-a9b5c199c312-ring-data-devices\") pod \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.668899 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1c53c8b5-e2cb-4953-b964-a9b5c199c312-dispersionconf\") pod \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.668936 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c53c8b5-e2cb-4953-b964-a9b5c199c312-scripts\") pod \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\" (UID: \"1c53c8b5-e2cb-4953-b964-a9b5c199c312\") " Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.669347 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c53c8b5-e2cb-4953-b964-a9b5c199c312-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1c53c8b5-e2cb-4953-b964-a9b5c199c312" (UID: "1c53c8b5-e2cb-4953-b964-a9b5c199c312"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.669532 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1c53c8b5-e2cb-4953-b964-a9b5c199c312-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1c53c8b5-e2cb-4953-b964-a9b5c199c312" (UID: "1c53c8b5-e2cb-4953-b964-a9b5c199c312"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.673463 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c53c8b5-e2cb-4953-b964-a9b5c199c312-kube-api-access-jn8nj" (OuterVolumeSpecName: "kube-api-access-jn8nj") pod "1c53c8b5-e2cb-4953-b964-a9b5c199c312" (UID: "1c53c8b5-e2cb-4953-b964-a9b5c199c312"). InnerVolumeSpecName "kube-api-access-jn8nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.688555 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1c53c8b5-e2cb-4953-b964-a9b5c199c312-scripts" (OuterVolumeSpecName: "scripts") pod "1c53c8b5-e2cb-4953-b964-a9b5c199c312" (UID: "1c53c8b5-e2cb-4953-b964-a9b5c199c312"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.696282 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c53c8b5-e2cb-4953-b964-a9b5c199c312-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1c53c8b5-e2cb-4953-b964-a9b5c199c312" (UID: "1c53c8b5-e2cb-4953-b964-a9b5c199c312"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.706041 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1c53c8b5-e2cb-4953-b964-a9b5c199c312-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1c53c8b5-e2cb-4953-b964-a9b5c199c312" (UID: "1c53c8b5-e2cb-4953-b964-a9b5c199c312"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.770119 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1c53c8b5-e2cb-4953-b964-a9b5c199c312-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.770179 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1c53c8b5-e2cb-4953-b964-a9b5c199c312-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.770192 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1c53c8b5-e2cb-4953-b964-a9b5c199c312-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.770199 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1c53c8b5-e2cb-4953-b964-a9b5c199c312-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.770208 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1c53c8b5-e2cb-4953-b964-a9b5c199c312-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:34 crc kubenswrapper[4738]: I0307 07:34:34.770216 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jn8nj\" (UniqueName: 
\"kubernetes.io/projected/1c53c8b5-e2cb-4953-b964-a9b5c199c312-kube-api-access-jn8nj\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.264771 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0f40460e778ea0c8b40099d9af71794663f18405f7f7da31eccfa947f477a56" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.264842 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w785r" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.819644 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj"] Mar 07 07:34:35 crc kubenswrapper[4738]: E0307 07:34:35.820140 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c53c8b5-e2cb-4953-b964-a9b5c199c312" containerName="swift-ring-rebalance" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.820218 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c53c8b5-e2cb-4953-b964-a9b5c199c312" containerName="swift-ring-rebalance" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.820568 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c53c8b5-e2cb-4953-b964-a9b5c199c312" containerName="swift-ring-rebalance" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.821634 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.827697 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.828030 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.841915 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj"] Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.888508 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-swiftconf\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.888560 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-ring-data-devices\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.888674 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-scripts\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.888780 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-dispersionconf\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.889006 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pffsk\" (UniqueName: \"kubernetes.io/projected/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-kube-api-access-pffsk\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.889151 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-etc-swift\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.990297 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-swiftconf\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.990366 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-ring-data-devices\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:35 
crc kubenswrapper[4738]: I0307 07:34:35.990419 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-scripts\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.990449 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-dispersionconf\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.990543 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pffsk\" (UniqueName: \"kubernetes.io/projected/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-kube-api-access-pffsk\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.990675 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-etc-swift\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.991449 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-scripts\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:35 crc 
kubenswrapper[4738]: I0307 07:34:35.991616 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-etc-swift\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:35 crc kubenswrapper[4738]: I0307 07:34:35.992316 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-ring-data-devices\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:36 crc kubenswrapper[4738]: I0307 07:34:35.996077 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-dispersionconf\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:36 crc kubenswrapper[4738]: I0307 07:34:35.997091 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-swiftconf\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:36 crc kubenswrapper[4738]: I0307 07:34:36.007893 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pffsk\" (UniqueName: \"kubernetes.io/projected/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-kube-api-access-pffsk\") pod \"swift-ring-rebalance-debug-gxjxj\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:36 crc kubenswrapper[4738]: 
I0307 07:34:36.187917 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:36 crc kubenswrapper[4738]: I0307 07:34:36.401856 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c53c8b5-e2cb-4953-b964-a9b5c199c312" path="/var/lib/kubelet/pods/1c53c8b5-e2cb-4953-b964-a9b5c199c312/volumes" Mar 07 07:34:36 crc kubenswrapper[4738]: I0307 07:34:36.405645 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj"] Mar 07 07:34:36 crc kubenswrapper[4738]: W0307 07:34:36.410506 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod830e3dbf_1811_4a2c_87a4_e91cd5b8f439.slice/crio-b5d8d24ff250a8bbf6d919d40b0db737bcfb0c89040ff6f134e22c6351db98e0 WatchSource:0}: Error finding container b5d8d24ff250a8bbf6d919d40b0db737bcfb0c89040ff6f134e22c6351db98e0: Status 404 returned error can't find the container with id b5d8d24ff250a8bbf6d919d40b0db737bcfb0c89040ff6f134e22c6351db98e0 Mar 07 07:34:37 crc kubenswrapper[4738]: I0307 07:34:37.294973 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" event={"ID":"830e3dbf-1811-4a2c-87a4-e91cd5b8f439","Type":"ContainerStarted","Data":"6f10ac55ca698b7413396ef4c7e04d5938349370aa55ae50ce9f88154b926dcb"} Mar 07 07:34:37 crc kubenswrapper[4738]: I0307 07:34:37.295376 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" event={"ID":"830e3dbf-1811-4a2c-87a4-e91cd5b8f439","Type":"ContainerStarted","Data":"b5d8d24ff250a8bbf6d919d40b0db737bcfb0c89040ff6f134e22c6351db98e0"} Mar 07 07:34:37 crc kubenswrapper[4738]: I0307 07:34:37.321435 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" 
podStartSLOduration=2.321416646 podStartE2EDuration="2.321416646s" podCreationTimestamp="2026-03-07 07:34:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:34:37.316951595 +0000 UTC m=+2095.781938926" watchObservedRunningTime="2026-03-07 07:34:37.321416646 +0000 UTC m=+2095.786403977" Mar 07 07:34:38 crc kubenswrapper[4738]: I0307 07:34:38.306126 4738 generic.go:334] "Generic (PLEG): container finished" podID="830e3dbf-1811-4a2c-87a4-e91cd5b8f439" containerID="6f10ac55ca698b7413396ef4c7e04d5938349370aa55ae50ce9f88154b926dcb" exitCode=0 Mar 07 07:34:38 crc kubenswrapper[4738]: I0307 07:34:38.306213 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" event={"ID":"830e3dbf-1811-4a2c-87a4-e91cd5b8f439","Type":"ContainerDied","Data":"6f10ac55ca698b7413396ef4c7e04d5938349370aa55ae50ce9f88154b926dcb"} Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.662229 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.698558 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj"] Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.698609 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj"] Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.853136 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-ring-data-devices\") pod \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.853288 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-scripts\") pod \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.853357 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pffsk\" (UniqueName: \"kubernetes.io/projected/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-kube-api-access-pffsk\") pod \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.853502 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-etc-swift\") pod \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.853602 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-swiftconf\") pod \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.853656 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-dispersionconf\") pod \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\" (UID: \"830e3dbf-1811-4a2c-87a4-e91cd5b8f439\") " Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.854490 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "830e3dbf-1811-4a2c-87a4-e91cd5b8f439" (UID: "830e3dbf-1811-4a2c-87a4-e91cd5b8f439"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.855278 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "830e3dbf-1811-4a2c-87a4-e91cd5b8f439" (UID: "830e3dbf-1811-4a2c-87a4-e91cd5b8f439"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.859150 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-kube-api-access-pffsk" (OuterVolumeSpecName: "kube-api-access-pffsk") pod "830e3dbf-1811-4a2c-87a4-e91cd5b8f439" (UID: "830e3dbf-1811-4a2c-87a4-e91cd5b8f439"). InnerVolumeSpecName "kube-api-access-pffsk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.887295 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "830e3dbf-1811-4a2c-87a4-e91cd5b8f439" (UID: "830e3dbf-1811-4a2c-87a4-e91cd5b8f439"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.887392 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-scripts" (OuterVolumeSpecName: "scripts") pod "830e3dbf-1811-4a2c-87a4-e91cd5b8f439" (UID: "830e3dbf-1811-4a2c-87a4-e91cd5b8f439"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.891263 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "830e3dbf-1811-4a2c-87a4-e91cd5b8f439" (UID: "830e3dbf-1811-4a2c-87a4-e91cd5b8f439"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.955467 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.955498 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.955509 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.955522 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.955531 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pffsk\" (UniqueName: \"kubernetes.io/projected/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-kube-api-access-pffsk\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:39 crc kubenswrapper[4738]: I0307 07:34:39.955539 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/830e3dbf-1811-4a2c-87a4-e91cd5b8f439-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:40 crc kubenswrapper[4738]: I0307 07:34:40.326837 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5d8d24ff250a8bbf6d919d40b0db737bcfb0c89040ff6f134e22c6351db98e0" Mar 07 07:34:40 crc kubenswrapper[4738]: I0307 07:34:40.326926 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gxjxj" Mar 07 07:34:40 crc kubenswrapper[4738]: I0307 07:34:40.397107 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="830e3dbf-1811-4a2c-87a4-e91cd5b8f439" path="/var/lib/kubelet/pods/830e3dbf-1811-4a2c-87a4-e91cd5b8f439/volumes" Mar 07 07:34:40 crc kubenswrapper[4738]: I0307 07:34:40.890631 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ddw47"] Mar 07 07:34:40 crc kubenswrapper[4738]: E0307 07:34:40.891002 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="830e3dbf-1811-4a2c-87a4-e91cd5b8f439" containerName="swift-ring-rebalance" Mar 07 07:34:40 crc kubenswrapper[4738]: I0307 07:34:40.891020 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="830e3dbf-1811-4a2c-87a4-e91cd5b8f439" containerName="swift-ring-rebalance" Mar 07 07:34:40 crc kubenswrapper[4738]: I0307 07:34:40.891208 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="830e3dbf-1811-4a2c-87a4-e91cd5b8f439" containerName="swift-ring-rebalance" Mar 07 07:34:40 crc kubenswrapper[4738]: I0307 07:34:40.891817 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:40 crc kubenswrapper[4738]: I0307 07:34:40.893508 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:34:40 crc kubenswrapper[4738]: I0307 07:34:40.894943 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:34:40 crc kubenswrapper[4738]: I0307 07:34:40.902024 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ddw47"] Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.071135 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5113d961-6a9a-41c8-b41a-8750a9757add-ring-data-devices\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.072123 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5113d961-6a9a-41c8-b41a-8750a9757add-swiftconf\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.072302 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5113d961-6a9a-41c8-b41a-8750a9757add-scripts\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.072336 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5113d961-6a9a-41c8-b41a-8750a9757add-dispersionconf\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.072411 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq4mn\" (UniqueName: \"kubernetes.io/projected/5113d961-6a9a-41c8-b41a-8750a9757add-kube-api-access-dq4mn\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.072465 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5113d961-6a9a-41c8-b41a-8750a9757add-etc-swift\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.173715 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5113d961-6a9a-41c8-b41a-8750a9757add-etc-swift\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.173798 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5113d961-6a9a-41c8-b41a-8750a9757add-ring-data-devices\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 
07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.173853 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5113d961-6a9a-41c8-b41a-8750a9757add-swiftconf\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.173906 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5113d961-6a9a-41c8-b41a-8750a9757add-scripts\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.173926 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5113d961-6a9a-41c8-b41a-8750a9757add-dispersionconf\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.173970 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq4mn\" (UniqueName: \"kubernetes.io/projected/5113d961-6a9a-41c8-b41a-8750a9757add-kube-api-access-dq4mn\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.174909 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5113d961-6a9a-41c8-b41a-8750a9757add-etc-swift\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 
crc kubenswrapper[4738]: I0307 07:34:41.175438 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5113d961-6a9a-41c8-b41a-8750a9757add-ring-data-devices\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.176057 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5113d961-6a9a-41c8-b41a-8750a9757add-scripts\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.179440 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5113d961-6a9a-41c8-b41a-8750a9757add-dispersionconf\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.183701 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5113d961-6a9a-41c8-b41a-8750a9757add-swiftconf\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.194780 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq4mn\" (UniqueName: \"kubernetes.io/projected/5113d961-6a9a-41c8-b41a-8750a9757add-kube-api-access-dq4mn\") pod \"swift-ring-rebalance-debug-ddw47\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: 
I0307 07:34:41.213435 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:41 crc kubenswrapper[4738]: I0307 07:34:41.663027 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ddw47"] Mar 07 07:34:42 crc kubenswrapper[4738]: I0307 07:34:42.350821 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" event={"ID":"5113d961-6a9a-41c8-b41a-8750a9757add","Type":"ContainerStarted","Data":"17905e1fc64d702c3f89ac471a11ca4e9424f2523386ca65ba23f2ddff9e1add"} Mar 07 07:34:42 crc kubenswrapper[4738]: I0307 07:34:42.351196 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" event={"ID":"5113d961-6a9a-41c8-b41a-8750a9757add","Type":"ContainerStarted","Data":"f6a68d8322b39c17a796e97e5f7b53b5e6ef785af38d9077d57999745ac80367"} Mar 07 07:34:42 crc kubenswrapper[4738]: I0307 07:34:42.375391 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" podStartSLOduration=2.375367128 podStartE2EDuration="2.375367128s" podCreationTimestamp="2026-03-07 07:34:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:34:42.369836159 +0000 UTC m=+2100.834823480" watchObservedRunningTime="2026-03-07 07:34:42.375367128 +0000 UTC m=+2100.840354489" Mar 07 07:34:43 crc kubenswrapper[4738]: I0307 07:34:43.361937 4738 generic.go:334] "Generic (PLEG): container finished" podID="5113d961-6a9a-41c8-b41a-8750a9757add" containerID="17905e1fc64d702c3f89ac471a11ca4e9424f2523386ca65ba23f2ddff9e1add" exitCode=0 Mar 07 07:34:43 crc kubenswrapper[4738]: I0307 07:34:43.362014 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" 
event={"ID":"5113d961-6a9a-41c8-b41a-8750a9757add","Type":"ContainerDied","Data":"17905e1fc64d702c3f89ac471a11ca4e9424f2523386ca65ba23f2ddff9e1add"} Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.699062 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.739020 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ddw47"] Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.740876 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5113d961-6a9a-41c8-b41a-8750a9757add-scripts\") pod \"5113d961-6a9a-41c8-b41a-8750a9757add\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.740987 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dq4mn\" (UniqueName: \"kubernetes.io/projected/5113d961-6a9a-41c8-b41a-8750a9757add-kube-api-access-dq4mn\") pod \"5113d961-6a9a-41c8-b41a-8750a9757add\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.741038 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5113d961-6a9a-41c8-b41a-8750a9757add-dispersionconf\") pod \"5113d961-6a9a-41c8-b41a-8750a9757add\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.741073 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5113d961-6a9a-41c8-b41a-8750a9757add-swiftconf\") pod \"5113d961-6a9a-41c8-b41a-8750a9757add\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.741146 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5113d961-6a9a-41c8-b41a-8750a9757add-etc-swift\") pod \"5113d961-6a9a-41c8-b41a-8750a9757add\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.741328 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5113d961-6a9a-41c8-b41a-8750a9757add-ring-data-devices\") pod \"5113d961-6a9a-41c8-b41a-8750a9757add\" (UID: \"5113d961-6a9a-41c8-b41a-8750a9757add\") " Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.742749 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5113d961-6a9a-41c8-b41a-8750a9757add-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5113d961-6a9a-41c8-b41a-8750a9757add" (UID: "5113d961-6a9a-41c8-b41a-8750a9757add"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.745273 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ddw47"] Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.766343 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5113d961-6a9a-41c8-b41a-8750a9757add-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5113d961-6a9a-41c8-b41a-8750a9757add" (UID: "5113d961-6a9a-41c8-b41a-8750a9757add"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.769083 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5113d961-6a9a-41c8-b41a-8750a9757add-scripts" (OuterVolumeSpecName: "scripts") pod "5113d961-6a9a-41c8-b41a-8750a9757add" (UID: "5113d961-6a9a-41c8-b41a-8750a9757add"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.770275 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5113d961-6a9a-41c8-b41a-8750a9757add-kube-api-access-dq4mn" (OuterVolumeSpecName: "kube-api-access-dq4mn") pod "5113d961-6a9a-41c8-b41a-8750a9757add" (UID: "5113d961-6a9a-41c8-b41a-8750a9757add"). InnerVolumeSpecName "kube-api-access-dq4mn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.793174 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5113d961-6a9a-41c8-b41a-8750a9757add-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5113d961-6a9a-41c8-b41a-8750a9757add" (UID: "5113d961-6a9a-41c8-b41a-8750a9757add"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.797421 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5113d961-6a9a-41c8-b41a-8750a9757add-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5113d961-6a9a-41c8-b41a-8750a9757add" (UID: "5113d961-6a9a-41c8-b41a-8750a9757add"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.843142 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5113d961-6a9a-41c8-b41a-8750a9757add-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.843197 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5113d961-6a9a-41c8-b41a-8750a9757add-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.843210 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5113d961-6a9a-41c8-b41a-8750a9757add-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.843222 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5113d961-6a9a-41c8-b41a-8750a9757add-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.843234 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5113d961-6a9a-41c8-b41a-8750a9757add-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:44 crc kubenswrapper[4738]: I0307 07:34:44.843246 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dq4mn\" (UniqueName: \"kubernetes.io/projected/5113d961-6a9a-41c8-b41a-8750a9757add-kube-api-access-dq4mn\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:45 crc kubenswrapper[4738]: I0307 07:34:45.379869 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6a68d8322b39c17a796e97e5f7b53b5e6ef785af38d9077d57999745ac80367" Mar 07 07:34:45 crc kubenswrapper[4738]: I0307 07:34:45.380263 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ddw47" Mar 07 07:34:45 crc kubenswrapper[4738]: I0307 07:34:45.890743 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b"] Mar 07 07:34:45 crc kubenswrapper[4738]: E0307 07:34:45.891741 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5113d961-6a9a-41c8-b41a-8750a9757add" containerName="swift-ring-rebalance" Mar 07 07:34:45 crc kubenswrapper[4738]: I0307 07:34:45.891774 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="5113d961-6a9a-41c8-b41a-8750a9757add" containerName="swift-ring-rebalance" Mar 07 07:34:45 crc kubenswrapper[4738]: I0307 07:34:45.892128 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="5113d961-6a9a-41c8-b41a-8750a9757add" containerName="swift-ring-rebalance" Mar 07 07:34:45 crc kubenswrapper[4738]: I0307 07:34:45.893250 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:45 crc kubenswrapper[4738]: I0307 07:34:45.903190 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b"] Mar 07 07:34:45 crc kubenswrapper[4738]: I0307 07:34:45.909146 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:34:45 crc kubenswrapper[4738]: I0307 07:34:45.909388 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:34:45 crc kubenswrapper[4738]: I0307 07:34:45.960182 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0521fb12-1237-4846-b0c7-a1a26f5d17cb-dispersionconf\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:45 crc kubenswrapper[4738]: I0307 07:34:45.960411 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0521fb12-1237-4846-b0c7-a1a26f5d17cb-swiftconf\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:45 crc kubenswrapper[4738]: I0307 07:34:45.960454 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0521fb12-1237-4846-b0c7-a1a26f5d17cb-scripts\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:45 crc kubenswrapper[4738]: I0307 07:34:45.960495 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0521fb12-1237-4846-b0c7-a1a26f5d17cb-etc-swift\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:45 crc kubenswrapper[4738]: I0307 07:34:45.960539 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0521fb12-1237-4846-b0c7-a1a26f5d17cb-ring-data-devices\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:45 crc kubenswrapper[4738]: I0307 07:34:45.960737 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz8wd\" (UniqueName: 
\"kubernetes.io/projected/0521fb12-1237-4846-b0c7-a1a26f5d17cb-kube-api-access-wz8wd\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.071703 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0521fb12-1237-4846-b0c7-a1a26f5d17cb-swiftconf\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.071939 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0521fb12-1237-4846-b0c7-a1a26f5d17cb-scripts\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.072232 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0521fb12-1237-4846-b0c7-a1a26f5d17cb-etc-swift\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.072379 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0521fb12-1237-4846-b0c7-a1a26f5d17cb-ring-data-devices\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.072534 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wz8wd\" 
(UniqueName: \"kubernetes.io/projected/0521fb12-1237-4846-b0c7-a1a26f5d17cb-kube-api-access-wz8wd\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.072670 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0521fb12-1237-4846-b0c7-a1a26f5d17cb-etc-swift\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.072766 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0521fb12-1237-4846-b0c7-a1a26f5d17cb-dispersionconf\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.072846 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0521fb12-1237-4846-b0c7-a1a26f5d17cb-scripts\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.073124 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0521fb12-1237-4846-b0c7-a1a26f5d17cb-ring-data-devices\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.082264 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/0521fb12-1237-4846-b0c7-a1a26f5d17cb-swiftconf\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.082706 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0521fb12-1237-4846-b0c7-a1a26f5d17cb-dispersionconf\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.094568 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz8wd\" (UniqueName: \"kubernetes.io/projected/0521fb12-1237-4846-b0c7-a1a26f5d17cb-kube-api-access-wz8wd\") pod \"swift-ring-rebalance-debug-ldg9b\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.134022 4738 scope.go:117] "RemoveContainer" containerID="7918b0bfc4e17a97adc26916733e2f5bee3ee3c2589e235b72fc6e8f46c6fbcc" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.235896 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.406102 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5113d961-6a9a-41c8-b41a-8750a9757add" path="/var/lib/kubelet/pods/5113d961-6a9a-41c8-b41a-8750a9757add/volumes" Mar 07 07:34:46 crc kubenswrapper[4738]: I0307 07:34:46.695249 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b"] Mar 07 07:34:47 crc kubenswrapper[4738]: I0307 07:34:47.402095 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" event={"ID":"0521fb12-1237-4846-b0c7-a1a26f5d17cb","Type":"ContainerStarted","Data":"5779b2d284d0aa6027d3dc6627f5da49255dc59b40064b64a27cea8d8cd31a35"} Mar 07 07:34:47 crc kubenswrapper[4738]: I0307 07:34:47.402554 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" event={"ID":"0521fb12-1237-4846-b0c7-a1a26f5d17cb","Type":"ContainerStarted","Data":"ead88b7d4b8040698d67bf54ff01f48bbd8773e515ff8a9ec31edc689308f448"} Mar 07 07:34:47 crc kubenswrapper[4738]: I0307 07:34:47.424574 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" podStartSLOduration=2.424557101 podStartE2EDuration="2.424557101s" podCreationTimestamp="2026-03-07 07:34:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:34:47.419194286 +0000 UTC m=+2105.884181607" watchObservedRunningTime="2026-03-07 07:34:47.424557101 +0000 UTC m=+2105.889544422" Mar 07 07:34:49 crc kubenswrapper[4738]: I0307 07:34:49.420444 4738 generic.go:334] "Generic (PLEG): container finished" podID="0521fb12-1237-4846-b0c7-a1a26f5d17cb" containerID="5779b2d284d0aa6027d3dc6627f5da49255dc59b40064b64a27cea8d8cd31a35" exitCode=0 
Mar 07 07:34:49 crc kubenswrapper[4738]: I0307 07:34:49.420641 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" event={"ID":"0521fb12-1237-4846-b0c7-a1a26f5d17cb","Type":"ContainerDied","Data":"5779b2d284d0aa6027d3dc6627f5da49255dc59b40064b64a27cea8d8cd31a35"} Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.693117 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.745476 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz8wd\" (UniqueName: \"kubernetes.io/projected/0521fb12-1237-4846-b0c7-a1a26f5d17cb-kube-api-access-wz8wd\") pod \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.745564 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0521fb12-1237-4846-b0c7-a1a26f5d17cb-ring-data-devices\") pod \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.745643 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0521fb12-1237-4846-b0c7-a1a26f5d17cb-etc-swift\") pod \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.745708 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0521fb12-1237-4846-b0c7-a1a26f5d17cb-dispersionconf\") pod \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 
07:34:50.745752 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0521fb12-1237-4846-b0c7-a1a26f5d17cb-swiftconf\") pod \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.745828 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0521fb12-1237-4846-b0c7-a1a26f5d17cb-scripts\") pod \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\" (UID: \"0521fb12-1237-4846-b0c7-a1a26f5d17cb\") " Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.747568 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0521fb12-1237-4846-b0c7-a1a26f5d17cb-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0521fb12-1237-4846-b0c7-a1a26f5d17cb" (UID: "0521fb12-1237-4846-b0c7-a1a26f5d17cb"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.748410 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0521fb12-1237-4846-b0c7-a1a26f5d17cb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0521fb12-1237-4846-b0c7-a1a26f5d17cb" (UID: "0521fb12-1237-4846-b0c7-a1a26f5d17cb"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.748463 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b"] Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.752133 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0521fb12-1237-4846-b0c7-a1a26f5d17cb-kube-api-access-wz8wd" (OuterVolumeSpecName: "kube-api-access-wz8wd") pod "0521fb12-1237-4846-b0c7-a1a26f5d17cb" (UID: "0521fb12-1237-4846-b0c7-a1a26f5d17cb"). InnerVolumeSpecName "kube-api-access-wz8wd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.759389 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b"] Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.777685 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0521fb12-1237-4846-b0c7-a1a26f5d17cb-scripts" (OuterVolumeSpecName: "scripts") pod "0521fb12-1237-4846-b0c7-a1a26f5d17cb" (UID: "0521fb12-1237-4846-b0c7-a1a26f5d17cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.783269 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0521fb12-1237-4846-b0c7-a1a26f5d17cb-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0521fb12-1237-4846-b0c7-a1a26f5d17cb" (UID: "0521fb12-1237-4846-b0c7-a1a26f5d17cb"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.785297 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0521fb12-1237-4846-b0c7-a1a26f5d17cb-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0521fb12-1237-4846-b0c7-a1a26f5d17cb" (UID: "0521fb12-1237-4846-b0c7-a1a26f5d17cb"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.848004 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0521fb12-1237-4846-b0c7-a1a26f5d17cb-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.848114 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0521fb12-1237-4846-b0c7-a1a26f5d17cb-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.848177 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0521fb12-1237-4846-b0c7-a1a26f5d17cb-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.848191 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wz8wd\" (UniqueName: \"kubernetes.io/projected/0521fb12-1237-4846-b0c7-a1a26f5d17cb-kube-api-access-wz8wd\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.848206 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0521fb12-1237-4846-b0c7-a1a26f5d17cb-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:50 crc kubenswrapper[4738]: I0307 07:34:50.848217 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/0521fb12-1237-4846-b0c7-a1a26f5d17cb-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:51 crc kubenswrapper[4738]: I0307 07:34:51.441693 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ead88b7d4b8040698d67bf54ff01f48bbd8773e515ff8a9ec31edc689308f448" Mar 07 07:34:51 crc kubenswrapper[4738]: I0307 07:34:51.441742 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ldg9b" Mar 07 07:34:51 crc kubenswrapper[4738]: I0307 07:34:51.904915 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6"] Mar 07 07:34:51 crc kubenswrapper[4738]: E0307 07:34:51.905356 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0521fb12-1237-4846-b0c7-a1a26f5d17cb" containerName="swift-ring-rebalance" Mar 07 07:34:51 crc kubenswrapper[4738]: I0307 07:34:51.905375 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0521fb12-1237-4846-b0c7-a1a26f5d17cb" containerName="swift-ring-rebalance" Mar 07 07:34:51 crc kubenswrapper[4738]: I0307 07:34:51.905563 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0521fb12-1237-4846-b0c7-a1a26f5d17cb" containerName="swift-ring-rebalance" Mar 07 07:34:51 crc kubenswrapper[4738]: I0307 07:34:51.906248 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:51 crc kubenswrapper[4738]: I0307 07:34:51.908560 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:34:51 crc kubenswrapper[4738]: I0307 07:34:51.908827 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:34:51 crc kubenswrapper[4738]: I0307 07:34:51.915334 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6"] Mar 07 07:34:51 crc kubenswrapper[4738]: I0307 07:34:51.965352 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-dispersionconf\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:51 crc kubenswrapper[4738]: I0307 07:34:51.965404 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-ring-data-devices\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:51 crc kubenswrapper[4738]: I0307 07:34:51.965432 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-scripts\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:51 crc kubenswrapper[4738]: I0307 07:34:51.965498 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv2p6\" (UniqueName: \"kubernetes.io/projected/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-kube-api-access-sv2p6\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:51 crc kubenswrapper[4738]: I0307 07:34:51.965526 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-etc-swift\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:51 crc kubenswrapper[4738]: I0307 07:34:51.965561 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-swiftconf\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:52 crc kubenswrapper[4738]: I0307 07:34:52.066413 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-dispersionconf\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:52 crc kubenswrapper[4738]: I0307 07:34:52.066468 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-ring-data-devices\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:52 
crc kubenswrapper[4738]: I0307 07:34:52.066496 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-scripts\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:52 crc kubenswrapper[4738]: I0307 07:34:52.066519 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-etc-swift\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:52 crc kubenswrapper[4738]: I0307 07:34:52.066536 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv2p6\" (UniqueName: \"kubernetes.io/projected/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-kube-api-access-sv2p6\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:52 crc kubenswrapper[4738]: I0307 07:34:52.066571 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-swiftconf\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:52 crc kubenswrapper[4738]: I0307 07:34:52.068126 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-scripts\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:52 crc 
kubenswrapper[4738]: I0307 07:34:52.068182 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-etc-swift\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:52 crc kubenswrapper[4738]: I0307 07:34:52.068552 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-ring-data-devices\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:52 crc kubenswrapper[4738]: I0307 07:34:52.074007 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-swiftconf\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:52 crc kubenswrapper[4738]: I0307 07:34:52.074942 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-dispersionconf\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:52 crc kubenswrapper[4738]: I0307 07:34:52.089657 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv2p6\" (UniqueName: \"kubernetes.io/projected/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-kube-api-access-sv2p6\") pod \"swift-ring-rebalance-debug-jgwz6\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:52 crc kubenswrapper[4738]: 
I0307 07:34:52.257534 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:52 crc kubenswrapper[4738]: I0307 07:34:52.406262 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0521fb12-1237-4846-b0c7-a1a26f5d17cb" path="/var/lib/kubelet/pods/0521fb12-1237-4846-b0c7-a1a26f5d17cb/volumes" Mar 07 07:34:52 crc kubenswrapper[4738]: I0307 07:34:52.678890 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6"] Mar 07 07:34:53 crc kubenswrapper[4738]: I0307 07:34:53.457883 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" event={"ID":"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c","Type":"ContainerStarted","Data":"ef2e0b73ba9d2121bec9c73242680d8111aeadde443277fed2de4c94beada1a6"} Mar 07 07:34:53 crc kubenswrapper[4738]: I0307 07:34:53.458223 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" event={"ID":"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c","Type":"ContainerStarted","Data":"b8d4c8d0e36e1a5349bf2c16312435cbb56863d3ca2a73b05d355ad5a1f922e5"} Mar 07 07:34:53 crc kubenswrapper[4738]: I0307 07:34:53.475506 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" podStartSLOduration=2.475492753 podStartE2EDuration="2.475492753s" podCreationTimestamp="2026-03-07 07:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:34:53.473173941 +0000 UTC m=+2111.938161262" watchObservedRunningTime="2026-03-07 07:34:53.475492753 +0000 UTC m=+2111.940480074" Mar 07 07:34:54 crc kubenswrapper[4738]: I0307 07:34:54.467305 4738 generic.go:334] "Generic (PLEG): container finished" podID="1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c" 
containerID="ef2e0b73ba9d2121bec9c73242680d8111aeadde443277fed2de4c94beada1a6" exitCode=0 Mar 07 07:34:54 crc kubenswrapper[4738]: I0307 07:34:54.467408 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" event={"ID":"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c","Type":"ContainerDied","Data":"ef2e0b73ba9d2121bec9c73242680d8111aeadde443277fed2de4c94beada1a6"} Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.790680 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.822558 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-scripts\") pod \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.822602 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-etc-swift\") pod \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.822674 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv2p6\" (UniqueName: \"kubernetes.io/projected/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-kube-api-access-sv2p6\") pod \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.822708 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-ring-data-devices\") pod \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\" (UID: 
\"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.823427 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-swiftconf\") pod \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.823452 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-dispersionconf\") pod \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\" (UID: \"1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c\") " Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.823419 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c" (UID: "1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.823619 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c" (UID: "1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.823936 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.823954 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.832374 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6"] Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.832558 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-kube-api-access-sv2p6" (OuterVolumeSpecName: "kube-api-access-sv2p6") pod "1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c" (UID: "1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c"). InnerVolumeSpecName "kube-api-access-sv2p6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.842263 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6"] Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.854288 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c" (UID: "1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.858676 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-scripts" (OuterVolumeSpecName: "scripts") pod "1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c" (UID: "1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.868252 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c" (UID: "1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.925473 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.925520 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv2p6\" (UniqueName: \"kubernetes.io/projected/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-kube-api-access-sv2p6\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.925539 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:55 crc kubenswrapper[4738]: I0307 07:34:55.925552 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:34:56 crc 
kubenswrapper[4738]: I0307 07:34:56.394537 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c" path="/var/lib/kubelet/pods/1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c/volumes" Mar 07 07:34:56 crc kubenswrapper[4738]: I0307 07:34:56.484302 4738 scope.go:117] "RemoveContainer" containerID="ef2e0b73ba9d2121bec9c73242680d8111aeadde443277fed2de4c94beada1a6" Mar 07 07:34:56 crc kubenswrapper[4738]: I0307 07:34:56.484364 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jgwz6" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.035009 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qcvst"] Mar 07 07:34:57 crc kubenswrapper[4738]: E0307 07:34:57.035440 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c" containerName="swift-ring-rebalance" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.035454 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c" containerName="swift-ring-rebalance" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.035648 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cd8ccc4-dfce-48ac-ab27-615f8b4c4c2c" containerName="swift-ring-rebalance" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.036315 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.042179 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.042391 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.055866 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qcvst"] Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.143392 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f322057-2b13-4ddc-b04f-596939060527-etc-swift\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.143625 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f322057-2b13-4ddc-b04f-596939060527-scripts\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.143706 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f322057-2b13-4ddc-b04f-596939060527-swiftconf\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.144044 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-fcmlk\" (UniqueName: \"kubernetes.io/projected/7f322057-2b13-4ddc-b04f-596939060527-kube-api-access-fcmlk\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.144115 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f322057-2b13-4ddc-b04f-596939060527-ring-data-devices\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.144353 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f322057-2b13-4ddc-b04f-596939060527-dispersionconf\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.245787 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f322057-2b13-4ddc-b04f-596939060527-swiftconf\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.245919 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcmlk\" (UniqueName: \"kubernetes.io/projected/7f322057-2b13-4ddc-b04f-596939060527-kube-api-access-fcmlk\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 
07:34:57.245950 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f322057-2b13-4ddc-b04f-596939060527-ring-data-devices\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.245998 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f322057-2b13-4ddc-b04f-596939060527-dispersionconf\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.246034 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f322057-2b13-4ddc-b04f-596939060527-etc-swift\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.246057 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f322057-2b13-4ddc-b04f-596939060527-scripts\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.246943 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f322057-2b13-4ddc-b04f-596939060527-etc-swift\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 
07:34:57.247076 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f322057-2b13-4ddc-b04f-596939060527-scripts\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.247200 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f322057-2b13-4ddc-b04f-596939060527-ring-data-devices\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.251075 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f322057-2b13-4ddc-b04f-596939060527-dispersionconf\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.251214 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f322057-2b13-4ddc-b04f-596939060527-swiftconf\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.266902 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcmlk\" (UniqueName: \"kubernetes.io/projected/7f322057-2b13-4ddc-b04f-596939060527-kube-api-access-fcmlk\") pod \"swift-ring-rebalance-debug-qcvst\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.353642 4738 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:34:57 crc kubenswrapper[4738]: I0307 07:34:57.781025 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qcvst"] Mar 07 07:34:58 crc kubenswrapper[4738]: I0307 07:34:58.502961 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" event={"ID":"7f322057-2b13-4ddc-b04f-596939060527","Type":"ContainerStarted","Data":"925e30982c4842bce8bc171832f5a644fba761046f8d33da5ce5707e29174ca1"} Mar 07 07:34:58 crc kubenswrapper[4738]: I0307 07:34:58.503298 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" event={"ID":"7f322057-2b13-4ddc-b04f-596939060527","Type":"ContainerStarted","Data":"b04c5f719616eb67e5184efe408c9d9fa127f9bdbe77bed9e127f4654c6b7379"} Mar 07 07:34:58 crc kubenswrapper[4738]: I0307 07:34:58.527214 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" podStartSLOduration=1.527198184 podStartE2EDuration="1.527198184s" podCreationTimestamp="2026-03-07 07:34:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:34:58.525742334 +0000 UTC m=+2116.990729655" watchObservedRunningTime="2026-03-07 07:34:58.527198184 +0000 UTC m=+2116.992185505" Mar 07 07:34:59 crc kubenswrapper[4738]: I0307 07:34:59.511485 4738 generic.go:334] "Generic (PLEG): container finished" podID="7f322057-2b13-4ddc-b04f-596939060527" containerID="925e30982c4842bce8bc171832f5a644fba761046f8d33da5ce5707e29174ca1" exitCode=0 Mar 07 07:34:59 crc kubenswrapper[4738]: I0307 07:34:59.511528 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" 
event={"ID":"7f322057-2b13-4ddc-b04f-596939060527","Type":"ContainerDied","Data":"925e30982c4842bce8bc171832f5a644fba761046f8d33da5ce5707e29174ca1"} Mar 07 07:35:00 crc kubenswrapper[4738]: I0307 07:35:00.832643 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:35:00 crc kubenswrapper[4738]: I0307 07:35:00.871707 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qcvst"] Mar 07 07:35:00 crc kubenswrapper[4738]: I0307 07:35:00.880359 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qcvst"] Mar 07 07:35:00 crc kubenswrapper[4738]: I0307 07:35:00.900801 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f322057-2b13-4ddc-b04f-596939060527-scripts\") pod \"7f322057-2b13-4ddc-b04f-596939060527\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " Mar 07 07:35:00 crc kubenswrapper[4738]: I0307 07:35:00.900881 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f322057-2b13-4ddc-b04f-596939060527-dispersionconf\") pod \"7f322057-2b13-4ddc-b04f-596939060527\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " Mar 07 07:35:00 crc kubenswrapper[4738]: I0307 07:35:00.900906 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f322057-2b13-4ddc-b04f-596939060527-swiftconf\") pod \"7f322057-2b13-4ddc-b04f-596939060527\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " Mar 07 07:35:00 crc kubenswrapper[4738]: I0307 07:35:00.900987 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f322057-2b13-4ddc-b04f-596939060527-etc-swift\") pod 
\"7f322057-2b13-4ddc-b04f-596939060527\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " Mar 07 07:35:00 crc kubenswrapper[4738]: I0307 07:35:00.901036 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f322057-2b13-4ddc-b04f-596939060527-ring-data-devices\") pod \"7f322057-2b13-4ddc-b04f-596939060527\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " Mar 07 07:35:00 crc kubenswrapper[4738]: I0307 07:35:00.901070 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcmlk\" (UniqueName: \"kubernetes.io/projected/7f322057-2b13-4ddc-b04f-596939060527-kube-api-access-fcmlk\") pod \"7f322057-2b13-4ddc-b04f-596939060527\" (UID: \"7f322057-2b13-4ddc-b04f-596939060527\") " Mar 07 07:35:00 crc kubenswrapper[4738]: I0307 07:35:00.901813 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f322057-2b13-4ddc-b04f-596939060527-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7f322057-2b13-4ddc-b04f-596939060527" (UID: "7f322057-2b13-4ddc-b04f-596939060527"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:00 crc kubenswrapper[4738]: I0307 07:35:00.902340 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f322057-2b13-4ddc-b04f-596939060527-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7f322057-2b13-4ddc-b04f-596939060527" (UID: "7f322057-2b13-4ddc-b04f-596939060527"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:35:00 crc kubenswrapper[4738]: I0307 07:35:00.906638 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f322057-2b13-4ddc-b04f-596939060527-kube-api-access-fcmlk" (OuterVolumeSpecName: "kube-api-access-fcmlk") pod "7f322057-2b13-4ddc-b04f-596939060527" (UID: "7f322057-2b13-4ddc-b04f-596939060527"). InnerVolumeSpecName "kube-api-access-fcmlk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:35:00 crc kubenswrapper[4738]: I0307 07:35:00.922102 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f322057-2b13-4ddc-b04f-596939060527-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7f322057-2b13-4ddc-b04f-596939060527" (UID: "7f322057-2b13-4ddc-b04f-596939060527"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:00 crc kubenswrapper[4738]: I0307 07:35:00.927242 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f322057-2b13-4ddc-b04f-596939060527-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7f322057-2b13-4ddc-b04f-596939060527" (UID: "7f322057-2b13-4ddc-b04f-596939060527"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:00 crc kubenswrapper[4738]: I0307 07:35:00.935402 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f322057-2b13-4ddc-b04f-596939060527-scripts" (OuterVolumeSpecName: "scripts") pod "7f322057-2b13-4ddc-b04f-596939060527" (UID: "7f322057-2b13-4ddc-b04f-596939060527"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:01 crc kubenswrapper[4738]: I0307 07:35:01.003373 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f322057-2b13-4ddc-b04f-596939060527-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:01 crc kubenswrapper[4738]: I0307 07:35:01.003614 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f322057-2b13-4ddc-b04f-596939060527-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:01 crc kubenswrapper[4738]: I0307 07:35:01.003709 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f322057-2b13-4ddc-b04f-596939060527-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:01 crc kubenswrapper[4738]: I0307 07:35:01.003767 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f322057-2b13-4ddc-b04f-596939060527-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:01 crc kubenswrapper[4738]: I0307 07:35:01.003818 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f322057-2b13-4ddc-b04f-596939060527-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:01 crc kubenswrapper[4738]: I0307 07:35:01.003870 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcmlk\" (UniqueName: \"kubernetes.io/projected/7f322057-2b13-4ddc-b04f-596939060527-kube-api-access-fcmlk\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:01 crc kubenswrapper[4738]: I0307 07:35:01.534196 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qcvst" Mar 07 07:35:01 crc kubenswrapper[4738]: I0307 07:35:01.534064 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b04c5f719616eb67e5184efe408c9d9fa127f9bdbe77bed9e127f4654c6b7379" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.044888 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv"] Mar 07 07:35:02 crc kubenswrapper[4738]: E0307 07:35:02.045267 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f322057-2b13-4ddc-b04f-596939060527" containerName="swift-ring-rebalance" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.045284 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f322057-2b13-4ddc-b04f-596939060527" containerName="swift-ring-rebalance" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.045448 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f322057-2b13-4ddc-b04f-596939060527" containerName="swift-ring-rebalance" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.046009 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.050217 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.050241 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.076004 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv"] Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.121284 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f4403867-0475-4c66-9f5f-1db307a2d2b8-swiftconf\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.121322 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f4403867-0475-4c66-9f5f-1db307a2d2b8-etc-swift\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.121357 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f4403867-0475-4c66-9f5f-1db307a2d2b8-dispersionconf\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.121421 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-g4g5k\" (UniqueName: \"kubernetes.io/projected/f4403867-0475-4c66-9f5f-1db307a2d2b8-kube-api-access-g4g5k\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.121465 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4403867-0475-4c66-9f5f-1db307a2d2b8-scripts\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.121486 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f4403867-0475-4c66-9f5f-1db307a2d2b8-ring-data-devices\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.222511 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g4g5k\" (UniqueName: \"kubernetes.io/projected/f4403867-0475-4c66-9f5f-1db307a2d2b8-kube-api-access-g4g5k\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.222608 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4403867-0475-4c66-9f5f-1db307a2d2b8-scripts\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 
07:35:02.222650 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f4403867-0475-4c66-9f5f-1db307a2d2b8-ring-data-devices\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.222691 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f4403867-0475-4c66-9f5f-1db307a2d2b8-swiftconf\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.222722 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f4403867-0475-4c66-9f5f-1db307a2d2b8-etc-swift\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.222764 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f4403867-0475-4c66-9f5f-1db307a2d2b8-dispersionconf\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.223963 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f4403867-0475-4c66-9f5f-1db307a2d2b8-ring-data-devices\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: 
I0307 07:35:02.224219 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f4403867-0475-4c66-9f5f-1db307a2d2b8-etc-swift\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.224451 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4403867-0475-4c66-9f5f-1db307a2d2b8-scripts\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.230913 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f4403867-0475-4c66-9f5f-1db307a2d2b8-swiftconf\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.232463 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f4403867-0475-4c66-9f5f-1db307a2d2b8-dispersionconf\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.244251 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4g5k\" (UniqueName: \"kubernetes.io/projected/f4403867-0475-4c66-9f5f-1db307a2d2b8-kube-api-access-g4g5k\") pod \"swift-ring-rebalance-debug-9cdqv\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.389599 4738 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.398903 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f322057-2b13-4ddc-b04f-596939060527" path="/var/lib/kubelet/pods/7f322057-2b13-4ddc-b04f-596939060527/volumes" Mar 07 07:35:02 crc kubenswrapper[4738]: W0307 07:35:02.827585 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4403867_0475_4c66_9f5f_1db307a2d2b8.slice/crio-d87127ec91e1918c9331a393727e16f310a264eccaa63ec58eead0133e7f28ca WatchSource:0}: Error finding container d87127ec91e1918c9331a393727e16f310a264eccaa63ec58eead0133e7f28ca: Status 404 returned error can't find the container with id d87127ec91e1918c9331a393727e16f310a264eccaa63ec58eead0133e7f28ca Mar 07 07:35:02 crc kubenswrapper[4738]: I0307 07:35:02.832065 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv"] Mar 07 07:35:03 crc kubenswrapper[4738]: I0307 07:35:03.565065 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" event={"ID":"f4403867-0475-4c66-9f5f-1db307a2d2b8","Type":"ContainerStarted","Data":"d9f0a20341f87a9053ab7c27401a90a00ba694c9b65b0f4681aa708e8b4ab077"} Mar 07 07:35:03 crc kubenswrapper[4738]: I0307 07:35:03.565524 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" event={"ID":"f4403867-0475-4c66-9f5f-1db307a2d2b8","Type":"ContainerStarted","Data":"d87127ec91e1918c9331a393727e16f310a264eccaa63ec58eead0133e7f28ca"} Mar 07 07:35:04 crc kubenswrapper[4738]: I0307 07:35:04.589340 4738 generic.go:334] "Generic (PLEG): container finished" podID="f4403867-0475-4c66-9f5f-1db307a2d2b8" containerID="d9f0a20341f87a9053ab7c27401a90a00ba694c9b65b0f4681aa708e8b4ab077" exitCode=0 Mar 07 
07:35:04 crc kubenswrapper[4738]: I0307 07:35:04.589455 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" event={"ID":"f4403867-0475-4c66-9f5f-1db307a2d2b8","Type":"ContainerDied","Data":"d9f0a20341f87a9053ab7c27401a90a00ba694c9b65b0f4681aa708e8b4ab077"} Mar 07 07:35:05 crc kubenswrapper[4738]: I0307 07:35:05.891531 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:05 crc kubenswrapper[4738]: I0307 07:35:05.941316 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv"] Mar 07 07:35:05 crc kubenswrapper[4738]: I0307 07:35:05.950919 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv"] Mar 07 07:35:05 crc kubenswrapper[4738]: I0307 07:35:05.989635 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f4403867-0475-4c66-9f5f-1db307a2d2b8-swiftconf\") pod \"f4403867-0475-4c66-9f5f-1db307a2d2b8\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " Mar 07 07:35:05 crc kubenswrapper[4738]: I0307 07:35:05.989762 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f4403867-0475-4c66-9f5f-1db307a2d2b8-dispersionconf\") pod \"f4403867-0475-4c66-9f5f-1db307a2d2b8\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " Mar 07 07:35:05 crc kubenswrapper[4738]: I0307 07:35:05.989868 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f4403867-0475-4c66-9f5f-1db307a2d2b8-etc-swift\") pod \"f4403867-0475-4c66-9f5f-1db307a2d2b8\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " Mar 07 07:35:05 crc kubenswrapper[4738]: I0307 07:35:05.989969 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g4g5k\" (UniqueName: \"kubernetes.io/projected/f4403867-0475-4c66-9f5f-1db307a2d2b8-kube-api-access-g4g5k\") pod \"f4403867-0475-4c66-9f5f-1db307a2d2b8\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " Mar 07 07:35:05 crc kubenswrapper[4738]: I0307 07:35:05.990016 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4403867-0475-4c66-9f5f-1db307a2d2b8-scripts\") pod \"f4403867-0475-4c66-9f5f-1db307a2d2b8\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " Mar 07 07:35:05 crc kubenswrapper[4738]: I0307 07:35:05.990052 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f4403867-0475-4c66-9f5f-1db307a2d2b8-ring-data-devices\") pod \"f4403867-0475-4c66-9f5f-1db307a2d2b8\" (UID: \"f4403867-0475-4c66-9f5f-1db307a2d2b8\") " Mar 07 07:35:05 crc kubenswrapper[4738]: I0307 07:35:05.991214 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4403867-0475-4c66-9f5f-1db307a2d2b8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f4403867-0475-4c66-9f5f-1db307a2d2b8" (UID: "f4403867-0475-4c66-9f5f-1db307a2d2b8"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:05 crc kubenswrapper[4738]: I0307 07:35:05.991334 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4403867-0475-4c66-9f5f-1db307a2d2b8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f4403867-0475-4c66-9f5f-1db307a2d2b8" (UID: "f4403867-0475-4c66-9f5f-1db307a2d2b8"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:35:06 crc kubenswrapper[4738]: I0307 07:35:06.012216 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4403867-0475-4c66-9f5f-1db307a2d2b8-scripts" (OuterVolumeSpecName: "scripts") pod "f4403867-0475-4c66-9f5f-1db307a2d2b8" (UID: "f4403867-0475-4c66-9f5f-1db307a2d2b8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:06 crc kubenswrapper[4738]: I0307 07:35:06.016768 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4403867-0475-4c66-9f5f-1db307a2d2b8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f4403867-0475-4c66-9f5f-1db307a2d2b8" (UID: "f4403867-0475-4c66-9f5f-1db307a2d2b8"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:06 crc kubenswrapper[4738]: I0307 07:35:06.020358 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4403867-0475-4c66-9f5f-1db307a2d2b8-kube-api-access-g4g5k" (OuterVolumeSpecName: "kube-api-access-g4g5k") pod "f4403867-0475-4c66-9f5f-1db307a2d2b8" (UID: "f4403867-0475-4c66-9f5f-1db307a2d2b8"). InnerVolumeSpecName "kube-api-access-g4g5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:35:06 crc kubenswrapper[4738]: I0307 07:35:06.024799 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4403867-0475-4c66-9f5f-1db307a2d2b8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f4403867-0475-4c66-9f5f-1db307a2d2b8" (UID: "f4403867-0475-4c66-9f5f-1db307a2d2b8"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:06 crc kubenswrapper[4738]: I0307 07:35:06.093512 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f4403867-0475-4c66-9f5f-1db307a2d2b8-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:06 crc kubenswrapper[4738]: I0307 07:35:06.093568 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g4g5k\" (UniqueName: \"kubernetes.io/projected/f4403867-0475-4c66-9f5f-1db307a2d2b8-kube-api-access-g4g5k\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:06 crc kubenswrapper[4738]: I0307 07:35:06.093587 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4403867-0475-4c66-9f5f-1db307a2d2b8-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:06 crc kubenswrapper[4738]: I0307 07:35:06.093600 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f4403867-0475-4c66-9f5f-1db307a2d2b8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:06 crc kubenswrapper[4738]: I0307 07:35:06.093614 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f4403867-0475-4c66-9f5f-1db307a2d2b8-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:06 crc kubenswrapper[4738]: I0307 07:35:06.093627 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f4403867-0475-4c66-9f5f-1db307a2d2b8-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:06 crc kubenswrapper[4738]: I0307 07:35:06.400915 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4403867-0475-4c66-9f5f-1db307a2d2b8" path="/var/lib/kubelet/pods/f4403867-0475-4c66-9f5f-1db307a2d2b8/volumes" Mar 07 07:35:06 crc kubenswrapper[4738]: I0307 07:35:06.608695 4738 scope.go:117] "RemoveContainer" 
containerID="d9f0a20341f87a9053ab7c27401a90a00ba694c9b65b0f4681aa708e8b4ab077" Mar 07 07:35:06 crc kubenswrapper[4738]: I0307 07:35:06.608768 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9cdqv" Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.139627 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"] Mar 07 07:35:07 crc kubenswrapper[4738]: E0307 07:35:07.139971 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4403867-0475-4c66-9f5f-1db307a2d2b8" containerName="swift-ring-rebalance" Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.139989 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4403867-0475-4c66-9f5f-1db307a2d2b8" containerName="swift-ring-rebalance" Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.140276 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4403867-0475-4c66-9f5f-1db307a2d2b8" containerName="swift-ring-rebalance" Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.140894 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5" Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.143789 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.145696 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.151618 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"] Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.211758 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d8447956-44d5-451a-94c0-eeeb56609808-dispersionconf\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5" Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.212132 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8447956-44d5-451a-94c0-eeeb56609808-scripts\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5" Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.212227 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d8447956-44d5-451a-94c0-eeeb56609808-ring-data-devices\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5" Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.212301 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d8447956-44d5-451a-94c0-eeeb56609808-etc-swift\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5" Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.212333 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d8447956-44d5-451a-94c0-eeeb56609808-swiftconf\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5" Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.212420 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbql2\" (UniqueName: \"kubernetes.io/projected/d8447956-44d5-451a-94c0-eeeb56609808-kube-api-access-fbql2\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5" Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.314311 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d8447956-44d5-451a-94c0-eeeb56609808-etc-swift\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5" Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.314428 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d8447956-44d5-451a-94c0-eeeb56609808-swiftconf\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5" Mar 07 07:35:07 crc kubenswrapper[4738]: 
I0307 07:35:07.314608 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbql2\" (UniqueName: \"kubernetes.io/projected/d8447956-44d5-451a-94c0-eeeb56609808-kube-api-access-fbql2\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"
Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.314878 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d8447956-44d5-451a-94c0-eeeb56609808-etc-swift\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"
Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.315449 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d8447956-44d5-451a-94c0-eeeb56609808-dispersionconf\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"
Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.315582 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8447956-44d5-451a-94c0-eeeb56609808-scripts\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"
Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.315734 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d8447956-44d5-451a-94c0-eeeb56609808-ring-data-devices\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"
Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.316706 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d8447956-44d5-451a-94c0-eeeb56609808-ring-data-devices\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"
Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.317844 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8447956-44d5-451a-94c0-eeeb56609808-scripts\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"
Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.319997 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d8447956-44d5-451a-94c0-eeeb56609808-swiftconf\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"
Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.320355 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d8447956-44d5-451a-94c0-eeeb56609808-dispersionconf\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"
Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.331959 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbql2\" (UniqueName: \"kubernetes.io/projected/d8447956-44d5-451a-94c0-eeeb56609808-kube-api-access-fbql2\") pod \"swift-ring-rebalance-debug-njtk5\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"
Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.465054 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"
Mar 07 07:35:07 crc kubenswrapper[4738]: I0307 07:35:07.738545 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"]
Mar 07 07:35:07 crc kubenswrapper[4738]: W0307 07:35:07.739512 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8447956_44d5_451a_94c0_eeeb56609808.slice/crio-2135e9da4e53ffb2246ca2a253fe8083e9f6b1605c8ffaf229ac035d87ad4b65 WatchSource:0}: Error finding container 2135e9da4e53ffb2246ca2a253fe8083e9f6b1605c8ffaf229ac035d87ad4b65: Status 404 returned error can't find the container with id 2135e9da4e53ffb2246ca2a253fe8083e9f6b1605c8ffaf229ac035d87ad4b65
Mar 07 07:35:08 crc kubenswrapper[4738]: I0307 07:35:08.633372 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5" event={"ID":"d8447956-44d5-451a-94c0-eeeb56609808","Type":"ContainerStarted","Data":"4865152f084910d04b52fcf18797127b471c68b10683c54cd7848eb6e738b288"}
Mar 07 07:35:08 crc kubenswrapper[4738]: I0307 07:35:08.633850 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5" event={"ID":"d8447956-44d5-451a-94c0-eeeb56609808","Type":"ContainerStarted","Data":"2135e9da4e53ffb2246ca2a253fe8083e9f6b1605c8ffaf229ac035d87ad4b65"}
Mar 07 07:35:08 crc kubenswrapper[4738]: I0307 07:35:08.653964 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5" podStartSLOduration=1.653945927 podStartE2EDuration="1.653945927s" podCreationTimestamp="2026-03-07 07:35:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:35:08.649187578 +0000 UTC m=+2127.114174939" watchObservedRunningTime="2026-03-07 07:35:08.653945927 +0000 UTC m=+2127.118933248"
Mar 07 07:35:09 crc kubenswrapper[4738]: I0307 07:35:09.647802 4738 generic.go:334] "Generic (PLEG): container finished" podID="d8447956-44d5-451a-94c0-eeeb56609808" containerID="4865152f084910d04b52fcf18797127b471c68b10683c54cd7848eb6e738b288" exitCode=0
Mar 07 07:35:09 crc kubenswrapper[4738]: I0307 07:35:09.647874 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5" event={"ID":"d8447956-44d5-451a-94c0-eeeb56609808","Type":"ContainerDied","Data":"4865152f084910d04b52fcf18797127b471c68b10683c54cd7848eb6e738b288"}
Mar 07 07:35:10 crc kubenswrapper[4738]: I0307 07:35:10.964459 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.002447 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"]
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.015143 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"]
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.081508 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d8447956-44d5-451a-94c0-eeeb56609808-dispersionconf\") pod \"d8447956-44d5-451a-94c0-eeeb56609808\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") "
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.081580 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d8447956-44d5-451a-94c0-eeeb56609808-swiftconf\") pod \"d8447956-44d5-451a-94c0-eeeb56609808\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") "
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.081637 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbql2\" (UniqueName: \"kubernetes.io/projected/d8447956-44d5-451a-94c0-eeeb56609808-kube-api-access-fbql2\") pod \"d8447956-44d5-451a-94c0-eeeb56609808\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") "
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.081693 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d8447956-44d5-451a-94c0-eeeb56609808-etc-swift\") pod \"d8447956-44d5-451a-94c0-eeeb56609808\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") "
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.082571 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8447956-44d5-451a-94c0-eeeb56609808-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d8447956-44d5-451a-94c0-eeeb56609808" (UID: "d8447956-44d5-451a-94c0-eeeb56609808"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.082725 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8447956-44d5-451a-94c0-eeeb56609808-scripts\") pod \"d8447956-44d5-451a-94c0-eeeb56609808\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") "
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.082915 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d8447956-44d5-451a-94c0-eeeb56609808-ring-data-devices\") pod \"d8447956-44d5-451a-94c0-eeeb56609808\" (UID: \"d8447956-44d5-451a-94c0-eeeb56609808\") "
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.083684 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8447956-44d5-451a-94c0-eeeb56609808-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d8447956-44d5-451a-94c0-eeeb56609808" (UID: "d8447956-44d5-451a-94c0-eeeb56609808"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.084480 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d8447956-44d5-451a-94c0-eeeb56609808-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.084533 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d8447956-44d5-451a-94c0-eeeb56609808-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.089249 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8447956-44d5-451a-94c0-eeeb56609808-kube-api-access-fbql2" (OuterVolumeSpecName: "kube-api-access-fbql2") pod "d8447956-44d5-451a-94c0-eeeb56609808" (UID: "d8447956-44d5-451a-94c0-eeeb56609808"). InnerVolumeSpecName "kube-api-access-fbql2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.110634 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8447956-44d5-451a-94c0-eeeb56609808-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d8447956-44d5-451a-94c0-eeeb56609808" (UID: "d8447956-44d5-451a-94c0-eeeb56609808"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.116875 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8447956-44d5-451a-94c0-eeeb56609808-scripts" (OuterVolumeSpecName: "scripts") pod "d8447956-44d5-451a-94c0-eeeb56609808" (UID: "d8447956-44d5-451a-94c0-eeeb56609808"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.118523 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d8447956-44d5-451a-94c0-eeeb56609808-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d8447956-44d5-451a-94c0-eeeb56609808" (UID: "d8447956-44d5-451a-94c0-eeeb56609808"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.185440 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d8447956-44d5-451a-94c0-eeeb56609808-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.185502 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d8447956-44d5-451a-94c0-eeeb56609808-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.185517 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbql2\" (UniqueName: \"kubernetes.io/projected/d8447956-44d5-451a-94c0-eeeb56609808-kube-api-access-fbql2\") on node \"crc\" DevicePath \"\""
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.185531 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d8447956-44d5-451a-94c0-eeeb56609808-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.668611 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2135e9da4e53ffb2246ca2a253fe8083e9f6b1605c8ffaf229ac035d87ad4b65"
Mar 07 07:35:11 crc kubenswrapper[4738]: I0307 07:35:11.668653 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-njtk5"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.190406 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"]
Mar 07 07:35:12 crc kubenswrapper[4738]: E0307 07:35:12.190846 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8447956-44d5-451a-94c0-eeeb56609808" containerName="swift-ring-rebalance"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.190868 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8447956-44d5-451a-94c0-eeeb56609808" containerName="swift-ring-rebalance"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.191198 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8447956-44d5-451a-94c0-eeeb56609808" containerName="swift-ring-rebalance"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.192200 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.194592 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.194696 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.199177 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"]
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.200818 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b967a03-47e9-45d0-9b76-c36571b2851c-scripts\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.200893 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b967a03-47e9-45d0-9b76-c36571b2851c-etc-swift\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.200936 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b967a03-47e9-45d0-9b76-c36571b2851c-dispersionconf\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.200991 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b967a03-47e9-45d0-9b76-c36571b2851c-swiftconf\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.201181 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdzv7\" (UniqueName: \"kubernetes.io/projected/7b967a03-47e9-45d0-9b76-c36571b2851c-kube-api-access-mdzv7\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.201307 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b967a03-47e9-45d0-9b76-c36571b2851c-ring-data-devices\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.302239 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b967a03-47e9-45d0-9b76-c36571b2851c-scripts\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.302290 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b967a03-47e9-45d0-9b76-c36571b2851c-etc-swift\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.302306 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b967a03-47e9-45d0-9b76-c36571b2851c-dispersionconf\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.302326 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b967a03-47e9-45d0-9b76-c36571b2851c-swiftconf\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.302683 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b967a03-47e9-45d0-9b76-c36571b2851c-etc-swift\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.302755 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdzv7\" (UniqueName: \"kubernetes.io/projected/7b967a03-47e9-45d0-9b76-c36571b2851c-kube-api-access-mdzv7\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.302870 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b967a03-47e9-45d0-9b76-c36571b2851c-ring-data-devices\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.303480 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b967a03-47e9-45d0-9b76-c36571b2851c-scripts\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.303653 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b967a03-47e9-45d0-9b76-c36571b2851c-ring-data-devices\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.306491 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b967a03-47e9-45d0-9b76-c36571b2851c-swiftconf\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.310047 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b967a03-47e9-45d0-9b76-c36571b2851c-dispersionconf\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.323242 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdzv7\" (UniqueName: \"kubernetes.io/projected/7b967a03-47e9-45d0-9b76-c36571b2851c-kube-api-access-mdzv7\") pod \"swift-ring-rebalance-debug-mt64j\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.394520 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8447956-44d5-451a-94c0-eeeb56609808" path="/var/lib/kubelet/pods/d8447956-44d5-451a-94c0-eeeb56609808/volumes"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.511117 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:12 crc kubenswrapper[4738]: I0307 07:35:12.922887 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"]
Mar 07 07:35:13 crc kubenswrapper[4738]: I0307 07:35:13.687692 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j" event={"ID":"7b967a03-47e9-45d0-9b76-c36571b2851c","Type":"ContainerStarted","Data":"8611f312ff2735a746f2207c556ecfb073909fadad9957abb2b5f120d59f74c4"}
Mar 07 07:35:13 crc kubenswrapper[4738]: I0307 07:35:13.687760 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j" event={"ID":"7b967a03-47e9-45d0-9b76-c36571b2851c","Type":"ContainerStarted","Data":"f106c77a00dcaf02c6771bd339c829fa1949309ea098d993d90aa93ce49f308e"}
Mar 07 07:35:13 crc kubenswrapper[4738]: I0307 07:35:13.715293 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j" podStartSLOduration=1.715268779 podStartE2EDuration="1.715268779s" podCreationTimestamp="2026-03-07 07:35:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:35:13.704367214 +0000 UTC m=+2132.169354555" watchObservedRunningTime="2026-03-07 07:35:13.715268779 +0000 UTC m=+2132.180256120"
Mar 07 07:35:14 crc kubenswrapper[4738]: I0307 07:35:14.709699 4738 generic.go:334] "Generic (PLEG): container finished" podID="7b967a03-47e9-45d0-9b76-c36571b2851c" containerID="8611f312ff2735a746f2207c556ecfb073909fadad9957abb2b5f120d59f74c4" exitCode=0
Mar 07 07:35:14 crc kubenswrapper[4738]: I0307 07:35:14.709757 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j" event={"ID":"7b967a03-47e9-45d0-9b76-c36571b2851c","Type":"ContainerDied","Data":"8611f312ff2735a746f2207c556ecfb073909fadad9957abb2b5f120d59f74c4"}
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.058760 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.060593 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b967a03-47e9-45d0-9b76-c36571b2851c-etc-swift\") pod \"7b967a03-47e9-45d0-9b76-c36571b2851c\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") "
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.060657 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b967a03-47e9-45d0-9b76-c36571b2851c-scripts\") pod \"7b967a03-47e9-45d0-9b76-c36571b2851c\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") "
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.060717 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b967a03-47e9-45d0-9b76-c36571b2851c-dispersionconf\") pod \"7b967a03-47e9-45d0-9b76-c36571b2851c\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") "
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.060748 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdzv7\" (UniqueName: \"kubernetes.io/projected/7b967a03-47e9-45d0-9b76-c36571b2851c-kube-api-access-mdzv7\") pod \"7b967a03-47e9-45d0-9b76-c36571b2851c\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") "
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.060789 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b967a03-47e9-45d0-9b76-c36571b2851c-swiftconf\") pod \"7b967a03-47e9-45d0-9b76-c36571b2851c\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") "
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.060916 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b967a03-47e9-45d0-9b76-c36571b2851c-ring-data-devices\") pod \"7b967a03-47e9-45d0-9b76-c36571b2851c\" (UID: \"7b967a03-47e9-45d0-9b76-c36571b2851c\") "
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.061559 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b967a03-47e9-45d0-9b76-c36571b2851c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7b967a03-47e9-45d0-9b76-c36571b2851c" (UID: "7b967a03-47e9-45d0-9b76-c36571b2851c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.061554 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b967a03-47e9-45d0-9b76-c36571b2851c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7b967a03-47e9-45d0-9b76-c36571b2851c" (UID: "7b967a03-47e9-45d0-9b76-c36571b2851c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.074765 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b967a03-47e9-45d0-9b76-c36571b2851c-kube-api-access-mdzv7" (OuterVolumeSpecName: "kube-api-access-mdzv7") pod "7b967a03-47e9-45d0-9b76-c36571b2851c" (UID: "7b967a03-47e9-45d0-9b76-c36571b2851c"). InnerVolumeSpecName "kube-api-access-mdzv7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.093855 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b967a03-47e9-45d0-9b76-c36571b2851c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7b967a03-47e9-45d0-9b76-c36571b2851c" (UID: "7b967a03-47e9-45d0-9b76-c36571b2851c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.098801 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b967a03-47e9-45d0-9b76-c36571b2851c-scripts" (OuterVolumeSpecName: "scripts") pod "7b967a03-47e9-45d0-9b76-c36571b2851c" (UID: "7b967a03-47e9-45d0-9b76-c36571b2851c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.104477 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"]
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.110707 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"]
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.114443 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b967a03-47e9-45d0-9b76-c36571b2851c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7b967a03-47e9-45d0-9b76-c36571b2851c" (UID: "7b967a03-47e9-45d0-9b76-c36571b2851c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.161977 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b967a03-47e9-45d0-9b76-c36571b2851c-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.162014 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdzv7\" (UniqueName: \"kubernetes.io/projected/7b967a03-47e9-45d0-9b76-c36571b2851c-kube-api-access-mdzv7\") on node \"crc\" DevicePath \"\""
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.162026 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b967a03-47e9-45d0-9b76-c36571b2851c-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.162034 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b967a03-47e9-45d0-9b76-c36571b2851c-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.162042 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b967a03-47e9-45d0-9b76-c36571b2851c-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.162053 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b967a03-47e9-45d0-9b76-c36571b2851c-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.403803 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b967a03-47e9-45d0-9b76-c36571b2851c" path="/var/lib/kubelet/pods/7b967a03-47e9-45d0-9b76-c36571b2851c/volumes"
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.730030 4738 scope.go:117] "RemoveContainer" containerID="8611f312ff2735a746f2207c556ecfb073909fadad9957abb2b5f120d59f74c4"
Mar 07 07:35:16 crc kubenswrapper[4738]: I0307 07:35:16.730580 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mt64j"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.253184 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"]
Mar 07 07:35:17 crc kubenswrapper[4738]: E0307 07:35:17.253661 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b967a03-47e9-45d0-9b76-c36571b2851c" containerName="swift-ring-rebalance"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.253683 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b967a03-47e9-45d0-9b76-c36571b2851c" containerName="swift-ring-rebalance"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.254017 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b967a03-47e9-45d0-9b76-c36571b2851c" containerName="swift-ring-rebalance"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.254802 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.258123 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.258197 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.264537 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"]
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.378143 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-scripts\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.378264 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-etc-swift\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.378746 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rldh\" (UniqueName: \"kubernetes.io/projected/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-kube-api-access-2rldh\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.378875 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-dispersionconf\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.379065 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-ring-data-devices\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.379173 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-swiftconf\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.480589 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-dispersionconf\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.480736 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-ring-data-devices\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.480868 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-swiftconf\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.481179 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-scripts\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.481229 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-etc-swift\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.481268 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rldh\" (UniqueName: \"kubernetes.io/projected/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-kube-api-access-2rldh\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.482124 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-etc-swift\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.482859 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-ring-data-devices\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.482965 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-scripts\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.487753 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-dispersionconf\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.489074 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-swiftconf\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.514016 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rldh\" (UniqueName: \"kubernetes.io/projected/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-kube-api-access-2rldh\") pod \"swift-ring-rebalance-debug-9gzm9\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"
Mar 07 07:35:17 crc kubenswrapper[4738]: 
I0307 07:35:17.578456 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9" Mar 07 07:35:17 crc kubenswrapper[4738]: I0307 07:35:17.821694 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"] Mar 07 07:35:18 crc kubenswrapper[4738]: I0307 07:35:18.754800 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9" event={"ID":"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639","Type":"ContainerStarted","Data":"0c7d7cf72c7859650d11474f855544943bd4f58c0a1500d8f5204aadc80a106a"} Mar 07 07:35:18 crc kubenswrapper[4738]: I0307 07:35:18.755083 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9" event={"ID":"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639","Type":"ContainerStarted","Data":"e2aebf7e9f82ae061af70ceba04f55caf2e6e358e4b4e6d4cbcb6e257ac3414e"} Mar 07 07:35:18 crc kubenswrapper[4738]: I0307 07:35:18.804897 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9" podStartSLOduration=1.804870177 podStartE2EDuration="1.804870177s" podCreationTimestamp="2026-03-07 07:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:35:18.79134773 +0000 UTC m=+2137.256335091" watchObservedRunningTime="2026-03-07 07:35:18.804870177 +0000 UTC m=+2137.269857538" Mar 07 07:35:19 crc kubenswrapper[4738]: I0307 07:35:19.769993 4738 generic.go:334] "Generic (PLEG): container finished" podID="4ddb1d44-d4f9-42b6-899a-f3fe2b34b639" containerID="0c7d7cf72c7859650d11474f855544943bd4f58c0a1500d8f5204aadc80a106a" exitCode=0 Mar 07 07:35:19 crc kubenswrapper[4738]: I0307 07:35:19.770059 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9" 
event={"ID":"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639","Type":"ContainerDied","Data":"0c7d7cf72c7859650d11474f855544943bd4f58c0a1500d8f5204aadc80a106a"} Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.140839 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9" Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.148434 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-scripts\") pod \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.148507 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-ring-data-devices\") pod \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.148541 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-dispersionconf\") pod \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.148570 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-swiftconf\") pod \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.148672 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-etc-swift\") pod 
\"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.148758 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rldh\" (UniqueName: \"kubernetes.io/projected/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-kube-api-access-2rldh\") pod \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\" (UID: \"4ddb1d44-d4f9-42b6-899a-f3fe2b34b639\") " Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.149985 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4ddb1d44-d4f9-42b6-899a-f3fe2b34b639" (UID: "4ddb1d44-d4f9-42b6-899a-f3fe2b34b639"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.150481 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4ddb1d44-d4f9-42b6-899a-f3fe2b34b639" (UID: "4ddb1d44-d4f9-42b6-899a-f3fe2b34b639"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.154232 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-kube-api-access-2rldh" (OuterVolumeSpecName: "kube-api-access-2rldh") pod "4ddb1d44-d4f9-42b6-899a-f3fe2b34b639" (UID: "4ddb1d44-d4f9-42b6-899a-f3fe2b34b639"). InnerVolumeSpecName "kube-api-access-2rldh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.177713 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4ddb1d44-d4f9-42b6-899a-f3fe2b34b639" (UID: "4ddb1d44-d4f9-42b6-899a-f3fe2b34b639"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.182107 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-scripts" (OuterVolumeSpecName: "scripts") pod "4ddb1d44-d4f9-42b6-899a-f3fe2b34b639" (UID: "4ddb1d44-d4f9-42b6-899a-f3fe2b34b639"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.187345 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4ddb1d44-d4f9-42b6-899a-f3fe2b34b639" (UID: "4ddb1d44-d4f9-42b6-899a-f3fe2b34b639"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.188603 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"] Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.200468 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9"] Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.250257 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rldh\" (UniqueName: \"kubernetes.io/projected/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-kube-api-access-2rldh\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.250289 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.250299 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.250308 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.250318 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.250326 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 
07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.793107 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2aebf7e9f82ae061af70ceba04f55caf2e6e358e4b4e6d4cbcb6e257ac3414e" Mar 07 07:35:21 crc kubenswrapper[4738]: I0307 07:35:21.793349 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9gzm9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.307512 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9"] Mar 07 07:35:22 crc kubenswrapper[4738]: E0307 07:35:22.307851 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ddb1d44-d4f9-42b6-899a-f3fe2b34b639" containerName="swift-ring-rebalance" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.307863 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ddb1d44-d4f9-42b6-899a-f3fe2b34b639" containerName="swift-ring-rebalance" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.308023 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ddb1d44-d4f9-42b6-899a-f3fe2b34b639" containerName="swift-ring-rebalance" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.308516 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.311002 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.311275 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.340177 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9"] Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.371109 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3c800294-72e6-443c-aba4-257275de39a6-dispersionconf\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.374498 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3c800294-72e6-443c-aba4-257275de39a6-swiftconf\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.374648 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c800294-72e6-443c-aba4-257275de39a6-scripts\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.374989 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3c800294-72e6-443c-aba4-257275de39a6-ring-data-devices\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.375949 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3c800294-72e6-443c-aba4-257275de39a6-etc-swift\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.376048 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z99j2\" (UniqueName: \"kubernetes.io/projected/3c800294-72e6-443c-aba4-257275de39a6-kube-api-access-z99j2\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.399429 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ddb1d44-d4f9-42b6-899a-f3fe2b34b639" path="/var/lib/kubelet/pods/4ddb1d44-d4f9-42b6-899a-f3fe2b34b639/volumes" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.478547 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3c800294-72e6-443c-aba4-257275de39a6-etc-swift\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.478650 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z99j2\" (UniqueName: 
\"kubernetes.io/projected/3c800294-72e6-443c-aba4-257275de39a6-kube-api-access-z99j2\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.478870 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3c800294-72e6-443c-aba4-257275de39a6-dispersionconf\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.478917 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3c800294-72e6-443c-aba4-257275de39a6-swiftconf\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.479014 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c800294-72e6-443c-aba4-257275de39a6-scripts\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.479071 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3c800294-72e6-443c-aba4-257275de39a6-ring-data-devices\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.479691 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/3c800294-72e6-443c-aba4-257275de39a6-etc-swift\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.480116 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3c800294-72e6-443c-aba4-257275de39a6-ring-data-devices\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.480904 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c800294-72e6-443c-aba4-257275de39a6-scripts\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.484896 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3c800294-72e6-443c-aba4-257275de39a6-dispersionconf\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.485045 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3c800294-72e6-443c-aba4-257275de39a6-swiftconf\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.500085 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z99j2\" (UniqueName: 
\"kubernetes.io/projected/3c800294-72e6-443c-aba4-257275de39a6-kube-api-access-z99j2\") pod \"swift-ring-rebalance-debug-kn5q9\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:22 crc kubenswrapper[4738]: I0307 07:35:22.638677 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:23 crc kubenswrapper[4738]: I0307 07:35:23.044288 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9"] Mar 07 07:35:23 crc kubenswrapper[4738]: W0307 07:35:23.053361 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c800294_72e6_443c_aba4_257275de39a6.slice/crio-ba5f14d740012e906003c2bbc0d8306a85b8436df8eddd8e715b34c49166f765 WatchSource:0}: Error finding container ba5f14d740012e906003c2bbc0d8306a85b8436df8eddd8e715b34c49166f765: Status 404 returned error can't find the container with id ba5f14d740012e906003c2bbc0d8306a85b8436df8eddd8e715b34c49166f765 Mar 07 07:35:23 crc kubenswrapper[4738]: I0307 07:35:23.814195 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" event={"ID":"3c800294-72e6-443c-aba4-257275de39a6","Type":"ContainerStarted","Data":"b10d32c08220340dfb8b86782fd6e199c81bd8f2a146118ddea9c49be4aeddf1"} Mar 07 07:35:23 crc kubenswrapper[4738]: I0307 07:35:23.814256 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" event={"ID":"3c800294-72e6-443c-aba4-257275de39a6","Type":"ContainerStarted","Data":"ba5f14d740012e906003c2bbc0d8306a85b8436df8eddd8e715b34c49166f765"} Mar 07 07:35:23 crc kubenswrapper[4738]: I0307 07:35:23.829644 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" 
podStartSLOduration=1.8296174170000001 podStartE2EDuration="1.829617417s" podCreationTimestamp="2026-03-07 07:35:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:35:23.826944395 +0000 UTC m=+2142.291931746" watchObservedRunningTime="2026-03-07 07:35:23.829617417 +0000 UTC m=+2142.294604778" Mar 07 07:35:24 crc kubenswrapper[4738]: I0307 07:35:24.830043 4738 generic.go:334] "Generic (PLEG): container finished" podID="3c800294-72e6-443c-aba4-257275de39a6" containerID="b10d32c08220340dfb8b86782fd6e199c81bd8f2a146118ddea9c49be4aeddf1" exitCode=0 Mar 07 07:35:24 crc kubenswrapper[4738]: I0307 07:35:24.830214 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" event={"ID":"3c800294-72e6-443c-aba4-257275de39a6","Type":"ContainerDied","Data":"b10d32c08220340dfb8b86782fd6e199c81bd8f2a146118ddea9c49be4aeddf1"} Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.204181 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.237547 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z99j2\" (UniqueName: \"kubernetes.io/projected/3c800294-72e6-443c-aba4-257275de39a6-kube-api-access-z99j2\") pod \"3c800294-72e6-443c-aba4-257275de39a6\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.237794 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3c800294-72e6-443c-aba4-257275de39a6-etc-swift\") pod \"3c800294-72e6-443c-aba4-257275de39a6\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.237901 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3c800294-72e6-443c-aba4-257275de39a6-dispersionconf\") pod \"3c800294-72e6-443c-aba4-257275de39a6\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.238006 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c800294-72e6-443c-aba4-257275de39a6-scripts\") pod \"3c800294-72e6-443c-aba4-257275de39a6\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.238100 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3c800294-72e6-443c-aba4-257275de39a6-swiftconf\") pod \"3c800294-72e6-443c-aba4-257275de39a6\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.238599 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/3c800294-72e6-443c-aba4-257275de39a6-ring-data-devices\") pod \"3c800294-72e6-443c-aba4-257275de39a6\" (UID: \"3c800294-72e6-443c-aba4-257275de39a6\") " Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.238691 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c800294-72e6-443c-aba4-257275de39a6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3c800294-72e6-443c-aba4-257275de39a6" (UID: "3c800294-72e6-443c-aba4-257275de39a6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.239089 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c800294-72e6-443c-aba4-257275de39a6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3c800294-72e6-443c-aba4-257275de39a6" (UID: "3c800294-72e6-443c-aba4-257275de39a6"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.239310 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3c800294-72e6-443c-aba4-257275de39a6-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.239328 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3c800294-72e6-443c-aba4-257275de39a6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.252696 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c800294-72e6-443c-aba4-257275de39a6-kube-api-access-z99j2" (OuterVolumeSpecName: "kube-api-access-z99j2") pod "3c800294-72e6-443c-aba4-257275de39a6" (UID: "3c800294-72e6-443c-aba4-257275de39a6"). 
InnerVolumeSpecName "kube-api-access-z99j2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.259836 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9"] Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.262097 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c800294-72e6-443c-aba4-257275de39a6-scripts" (OuterVolumeSpecName: "scripts") pod "3c800294-72e6-443c-aba4-257275de39a6" (UID: "3c800294-72e6-443c-aba4-257275de39a6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.269852 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9"] Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.287006 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c800294-72e6-443c-aba4-257275de39a6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3c800294-72e6-443c-aba4-257275de39a6" (UID: "3c800294-72e6-443c-aba4-257275de39a6"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.291535 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c800294-72e6-443c-aba4-257275de39a6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3c800294-72e6-443c-aba4-257275de39a6" (UID: "3c800294-72e6-443c-aba4-257275de39a6"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.340114 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z99j2\" (UniqueName: \"kubernetes.io/projected/3c800294-72e6-443c-aba4-257275de39a6-kube-api-access-z99j2\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.340147 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3c800294-72e6-443c-aba4-257275de39a6-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.340178 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3c800294-72e6-443c-aba4-257275de39a6-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.340188 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3c800294-72e6-443c-aba4-257275de39a6-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.397716 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c800294-72e6-443c-aba4-257275de39a6" path="/var/lib/kubelet/pods/3c800294-72e6-443c-aba4-257275de39a6/volumes" Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.852528 4738 scope.go:117] "RemoveContainer" containerID="b10d32c08220340dfb8b86782fd6e199c81bd8f2a146118ddea9c49be4aeddf1" Mar 07 07:35:26 crc kubenswrapper[4738]: I0307 07:35:26.852795 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kn5q9" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.465348 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mr49p"] Mar 07 07:35:27 crc kubenswrapper[4738]: E0307 07:35:27.465634 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c800294-72e6-443c-aba4-257275de39a6" containerName="swift-ring-rebalance" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.465647 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c800294-72e6-443c-aba4-257275de39a6" containerName="swift-ring-rebalance" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.465805 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c800294-72e6-443c-aba4-257275de39a6" containerName="swift-ring-rebalance" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.466264 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.470764 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.473127 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.488034 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mr49p"] Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.558693 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65040c17-a9b3-4bf6-b40f-af1c5290f349-scripts\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.558780 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65040c17-a9b3-4bf6-b40f-af1c5290f349-etc-swift\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.558886 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65040c17-a9b3-4bf6-b40f-af1c5290f349-swiftconf\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.559018 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fglqs\" (UniqueName: \"kubernetes.io/projected/65040c17-a9b3-4bf6-b40f-af1c5290f349-kube-api-access-fglqs\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.559146 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65040c17-a9b3-4bf6-b40f-af1c5290f349-ring-data-devices\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.559231 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/65040c17-a9b3-4bf6-b40f-af1c5290f349-dispersionconf\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.660470 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65040c17-a9b3-4bf6-b40f-af1c5290f349-swiftconf\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.660569 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fglqs\" (UniqueName: \"kubernetes.io/projected/65040c17-a9b3-4bf6-b40f-af1c5290f349-kube-api-access-fglqs\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.660624 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65040c17-a9b3-4bf6-b40f-af1c5290f349-ring-data-devices\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.660651 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65040c17-a9b3-4bf6-b40f-af1c5290f349-dispersionconf\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.660720 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/65040c17-a9b3-4bf6-b40f-af1c5290f349-scripts\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.660751 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65040c17-a9b3-4bf6-b40f-af1c5290f349-etc-swift\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.661360 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65040c17-a9b3-4bf6-b40f-af1c5290f349-etc-swift\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.662251 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65040c17-a9b3-4bf6-b40f-af1c5290f349-scripts\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.662258 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65040c17-a9b3-4bf6-b40f-af1c5290f349-ring-data-devices\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.674833 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/65040c17-a9b3-4bf6-b40f-af1c5290f349-dispersionconf\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.675628 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65040c17-a9b3-4bf6-b40f-af1c5290f349-swiftconf\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.680088 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fglqs\" (UniqueName: \"kubernetes.io/projected/65040c17-a9b3-4bf6-b40f-af1c5290f349-kube-api-access-fglqs\") pod \"swift-ring-rebalance-debug-mr49p\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:27 crc kubenswrapper[4738]: I0307 07:35:27.801282 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:28 crc kubenswrapper[4738]: I0307 07:35:28.079601 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mr49p"] Mar 07 07:35:28 crc kubenswrapper[4738]: W0307 07:35:28.084455 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65040c17_a9b3_4bf6_b40f_af1c5290f349.slice/crio-8e060013bde6f2af5f3411ca33c84bdd70871d0d89592476004f691f9449f961 WatchSource:0}: Error finding container 8e060013bde6f2af5f3411ca33c84bdd70871d0d89592476004f691f9449f961: Status 404 returned error can't find the container with id 8e060013bde6f2af5f3411ca33c84bdd70871d0d89592476004f691f9449f961 Mar 07 07:35:28 crc kubenswrapper[4738]: I0307 07:35:28.872596 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" event={"ID":"65040c17-a9b3-4bf6-b40f-af1c5290f349","Type":"ContainerStarted","Data":"d50e6b3fe9014d62705d98cdbe1fb11f448ed5b24777d4c759ef3020401b0842"} Mar 07 07:35:28 crc kubenswrapper[4738]: I0307 07:35:28.872855 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" event={"ID":"65040c17-a9b3-4bf6-b40f-af1c5290f349","Type":"ContainerStarted","Data":"8e060013bde6f2af5f3411ca33c84bdd70871d0d89592476004f691f9449f961"} Mar 07 07:35:28 crc kubenswrapper[4738]: I0307 07:35:28.891426 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" podStartSLOduration=1.891408371 podStartE2EDuration="1.891408371s" podCreationTimestamp="2026-03-07 07:35:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:35:28.886172269 +0000 UTC m=+2147.351159590" watchObservedRunningTime="2026-03-07 
07:35:28.891408371 +0000 UTC m=+2147.356395692" Mar 07 07:35:30 crc kubenswrapper[4738]: I0307 07:35:30.896552 4738 generic.go:334] "Generic (PLEG): container finished" podID="65040c17-a9b3-4bf6-b40f-af1c5290f349" containerID="d50e6b3fe9014d62705d98cdbe1fb11f448ed5b24777d4c759ef3020401b0842" exitCode=0 Mar 07 07:35:30 crc kubenswrapper[4738]: I0307 07:35:30.896629 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" event={"ID":"65040c17-a9b3-4bf6-b40f-af1c5290f349","Type":"ContainerDied","Data":"d50e6b3fe9014d62705d98cdbe1fb11f448ed5b24777d4c759ef3020401b0842"} Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.186937 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.229119 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65040c17-a9b3-4bf6-b40f-af1c5290f349-scripts\") pod \"65040c17-a9b3-4bf6-b40f-af1c5290f349\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.229208 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65040c17-a9b3-4bf6-b40f-af1c5290f349-etc-swift\") pod \"65040c17-a9b3-4bf6-b40f-af1c5290f349\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.229252 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65040c17-a9b3-4bf6-b40f-af1c5290f349-dispersionconf\") pod \"65040c17-a9b3-4bf6-b40f-af1c5290f349\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.229273 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65040c17-a9b3-4bf6-b40f-af1c5290f349-ring-data-devices\") pod \"65040c17-a9b3-4bf6-b40f-af1c5290f349\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.229329 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fglqs\" (UniqueName: \"kubernetes.io/projected/65040c17-a9b3-4bf6-b40f-af1c5290f349-kube-api-access-fglqs\") pod \"65040c17-a9b3-4bf6-b40f-af1c5290f349\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.229349 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65040c17-a9b3-4bf6-b40f-af1c5290f349-swiftconf\") pod \"65040c17-a9b3-4bf6-b40f-af1c5290f349\" (UID: \"65040c17-a9b3-4bf6-b40f-af1c5290f349\") " Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.230385 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65040c17-a9b3-4bf6-b40f-af1c5290f349-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "65040c17-a9b3-4bf6-b40f-af1c5290f349" (UID: "65040c17-a9b3-4bf6-b40f-af1c5290f349"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.230412 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65040c17-a9b3-4bf6-b40f-af1c5290f349-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "65040c17-a9b3-4bf6-b40f-af1c5290f349" (UID: "65040c17-a9b3-4bf6-b40f-af1c5290f349"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.230856 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mr49p"] Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.235671 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65040c17-a9b3-4bf6-b40f-af1c5290f349-kube-api-access-fglqs" (OuterVolumeSpecName: "kube-api-access-fglqs") pod "65040c17-a9b3-4bf6-b40f-af1c5290f349" (UID: "65040c17-a9b3-4bf6-b40f-af1c5290f349"). InnerVolumeSpecName "kube-api-access-fglqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.238478 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mr49p"] Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.249246 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65040c17-a9b3-4bf6-b40f-af1c5290f349-scripts" (OuterVolumeSpecName: "scripts") pod "65040c17-a9b3-4bf6-b40f-af1c5290f349" (UID: "65040c17-a9b3-4bf6-b40f-af1c5290f349"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.252318 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65040c17-a9b3-4bf6-b40f-af1c5290f349-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "65040c17-a9b3-4bf6-b40f-af1c5290f349" (UID: "65040c17-a9b3-4bf6-b40f-af1c5290f349"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.255120 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65040c17-a9b3-4bf6-b40f-af1c5290f349-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "65040c17-a9b3-4bf6-b40f-af1c5290f349" (UID: "65040c17-a9b3-4bf6-b40f-af1c5290f349"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.331110 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65040c17-a9b3-4bf6-b40f-af1c5290f349-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.331377 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65040c17-a9b3-4bf6-b40f-af1c5290f349-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.331391 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fglqs\" (UniqueName: \"kubernetes.io/projected/65040c17-a9b3-4bf6-b40f-af1c5290f349-kube-api-access-fglqs\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.331405 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65040c17-a9b3-4bf6-b40f-af1c5290f349-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.331417 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65040c17-a9b3-4bf6-b40f-af1c5290f349-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.331428 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/65040c17-a9b3-4bf6-b40f-af1c5290f349-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.394593 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65040c17-a9b3-4bf6-b40f-af1c5290f349" path="/var/lib/kubelet/pods/65040c17-a9b3-4bf6-b40f-af1c5290f349/volumes" Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.922189 4738 scope.go:117] "RemoveContainer" containerID="d50e6b3fe9014d62705d98cdbe1fb11f448ed5b24777d4c759ef3020401b0842" Mar 07 07:35:32 crc kubenswrapper[4738]: I0307 07:35:32.922239 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mr49p" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.404091 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-drfwn"] Mar 07 07:35:33 crc kubenswrapper[4738]: E0307 07:35:33.404460 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65040c17-a9b3-4bf6-b40f-af1c5290f349" containerName="swift-ring-rebalance" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.404477 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="65040c17-a9b3-4bf6-b40f-af1c5290f349" containerName="swift-ring-rebalance" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.404662 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="65040c17-a9b3-4bf6-b40f-af1c5290f349" containerName="swift-ring-rebalance" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.405107 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.409529 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.409502 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.428713 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-drfwn"] Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.550513 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-etc-swift\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.550593 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrrvq\" (UniqueName: \"kubernetes.io/projected/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-kube-api-access-vrrvq\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.550638 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-dispersionconf\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.550670 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-scripts\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.550689 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-ring-data-devices\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.550885 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-swiftconf\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.652902 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-etc-swift\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.652975 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrrvq\" (UniqueName: \"kubernetes.io/projected/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-kube-api-access-vrrvq\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc 
kubenswrapper[4738]: I0307 07:35:33.653024 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-dispersionconf\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.653056 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-scripts\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.653083 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-ring-data-devices\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.653791 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-etc-swift\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.653131 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-swiftconf\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc 
kubenswrapper[4738]: I0307 07:35:33.654207 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-ring-data-devices\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.654596 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-scripts\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.657727 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-swiftconf\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.659289 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-dispersionconf\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: I0307 07:35:33.676610 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrrvq\" (UniqueName: \"kubernetes.io/projected/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-kube-api-access-vrrvq\") pod \"swift-ring-rebalance-debug-drfwn\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:33 crc kubenswrapper[4738]: 
I0307 07:35:33.746528 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:34 crc kubenswrapper[4738]: I0307 07:35:34.152348 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-drfwn"] Mar 07 07:35:34 crc kubenswrapper[4738]: I0307 07:35:34.954805 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" event={"ID":"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8","Type":"ContainerStarted","Data":"eff191649166959173501b3653c56a69969a882daf49d883093ced5e1624f39e"} Mar 07 07:35:34 crc kubenswrapper[4738]: I0307 07:35:34.955430 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" event={"ID":"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8","Type":"ContainerStarted","Data":"d5a9608778cd571e2763b8d2e0ef1056d7421f7078014f4fbff050f9c168480a"} Mar 07 07:35:35 crc kubenswrapper[4738]: I0307 07:35:35.969703 4738 generic.go:334] "Generic (PLEG): container finished" podID="5bd2bedb-dfde-4d96-b4eb-632eda5f81c8" containerID="eff191649166959173501b3653c56a69969a882daf49d883093ced5e1624f39e" exitCode=0 Mar 07 07:35:35 crc kubenswrapper[4738]: I0307 07:35:35.969813 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" event={"ID":"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8","Type":"ContainerDied","Data":"eff191649166959173501b3653c56a69969a882daf49d883093ced5e1624f39e"} Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.312566 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.378683 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-drfwn"] Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.385642 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-drfwn"] Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.417911 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-dispersionconf\") pod \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.418044 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-scripts\") pod \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.418098 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-etc-swift\") pod \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.418132 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-ring-data-devices\") pod \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.418212 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-swiftconf\") pod \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.418294 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vrrvq\" (UniqueName: \"kubernetes.io/projected/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-kube-api-access-vrrvq\") pod \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\" (UID: \"5bd2bedb-dfde-4d96-b4eb-632eda5f81c8\") " Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.419027 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5bd2bedb-dfde-4d96-b4eb-632eda5f81c8" (UID: "5bd2bedb-dfde-4d96-b4eb-632eda5f81c8"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.419315 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5bd2bedb-dfde-4d96-b4eb-632eda5f81c8" (UID: "5bd2bedb-dfde-4d96-b4eb-632eda5f81c8"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.423637 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-kube-api-access-vrrvq" (OuterVolumeSpecName: "kube-api-access-vrrvq") pod "5bd2bedb-dfde-4d96-b4eb-632eda5f81c8" (UID: "5bd2bedb-dfde-4d96-b4eb-632eda5f81c8"). InnerVolumeSpecName "kube-api-access-vrrvq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.438195 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-scripts" (OuterVolumeSpecName: "scripts") pod "5bd2bedb-dfde-4d96-b4eb-632eda5f81c8" (UID: "5bd2bedb-dfde-4d96-b4eb-632eda5f81c8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.439034 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5bd2bedb-dfde-4d96-b4eb-632eda5f81c8" (UID: "5bd2bedb-dfde-4d96-b4eb-632eda5f81c8"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.442221 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5bd2bedb-dfde-4d96-b4eb-632eda5f81c8" (UID: "5bd2bedb-dfde-4d96-b4eb-632eda5f81c8"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.520909 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.520953 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.520971 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.520992 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.521011 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vrrvq\" (UniqueName: \"kubernetes.io/projected/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-kube-api-access-vrrvq\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:37 crc kubenswrapper[4738]: I0307 07:35:37.521031 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.005668 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5a9608778cd571e2763b8d2e0ef1056d7421f7078014f4fbff050f9c168480a" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.005796 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-drfwn" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.403513 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bd2bedb-dfde-4d96-b4eb-632eda5f81c8" path="/var/lib/kubelet/pods/5bd2bedb-dfde-4d96-b4eb-632eda5f81c8/volumes" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.583431 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2"] Mar 07 07:35:38 crc kubenswrapper[4738]: E0307 07:35:38.583765 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bd2bedb-dfde-4d96-b4eb-632eda5f81c8" containerName="swift-ring-rebalance" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.583784 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bd2bedb-dfde-4d96-b4eb-632eda5f81c8" containerName="swift-ring-rebalance" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.583945 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bd2bedb-dfde-4d96-b4eb-632eda5f81c8" containerName="swift-ring-rebalance" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.584531 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.591465 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2"] Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.594693 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.594884 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.739585 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-scripts\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.739848 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-swiftconf\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.739935 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-dispersionconf\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.740029 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-tntx8\" (UniqueName: \"kubernetes.io/projected/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-kube-api-access-tntx8\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.740149 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-ring-data-devices\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.740241 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-etc-swift\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.841244 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-ring-data-devices\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.841621 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-etc-swift\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 
07:35:38.841699 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-scripts\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.841748 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-swiftconf\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.841801 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-dispersionconf\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.841851 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tntx8\" (UniqueName: \"kubernetes.io/projected/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-kube-api-access-tntx8\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.842612 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-etc-swift\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 
07:35:38.842813 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-ring-data-devices\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.843254 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-scripts\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.846643 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-dispersionconf\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.847645 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-swiftconf\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.858685 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tntx8\" (UniqueName: \"kubernetes.io/projected/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-kube-api-access-tntx8\") pod \"swift-ring-rebalance-debug-zmhw2\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:38 crc kubenswrapper[4738]: I0307 07:35:38.911236 4738 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:39 crc kubenswrapper[4738]: I0307 07:35:39.344312 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2"] Mar 07 07:35:40 crc kubenswrapper[4738]: I0307 07:35:40.024499 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" event={"ID":"5f216e5d-f9e1-4513-bb1a-dcddb3febb29","Type":"ContainerStarted","Data":"713b29849e2af4600c1bdfcd486d63bdb05382be1d857b895deac05cfc98158d"} Mar 07 07:35:40 crc kubenswrapper[4738]: I0307 07:35:40.024856 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" event={"ID":"5f216e5d-f9e1-4513-bb1a-dcddb3febb29","Type":"ContainerStarted","Data":"a51d1558801ddcf1a4533c0ed10411411035221266449c3bb0e01ee78c70c007"} Mar 07 07:35:40 crc kubenswrapper[4738]: I0307 07:35:40.055996 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" podStartSLOduration=2.055975241 podStartE2EDuration="2.055975241s" podCreationTimestamp="2026-03-07 07:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:35:40.051189071 +0000 UTC m=+2158.516176402" watchObservedRunningTime="2026-03-07 07:35:40.055975241 +0000 UTC m=+2158.520962572" Mar 07 07:35:41 crc kubenswrapper[4738]: I0307 07:35:41.035755 4738 generic.go:334] "Generic (PLEG): container finished" podID="5f216e5d-f9e1-4513-bb1a-dcddb3febb29" containerID="713b29849e2af4600c1bdfcd486d63bdb05382be1d857b895deac05cfc98158d" exitCode=0 Mar 07 07:35:41 crc kubenswrapper[4738]: I0307 07:35:41.035807 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" 
event={"ID":"5f216e5d-f9e1-4513-bb1a-dcddb3febb29","Type":"ContainerDied","Data":"713b29849e2af4600c1bdfcd486d63bdb05382be1d857b895deac05cfc98158d"} Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.366225 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.400854 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-dispersionconf\") pod \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.400975 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-scripts\") pod \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.401037 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-etc-swift\") pod \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.401090 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-ring-data-devices\") pod \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.401125 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tntx8\" (UniqueName: 
\"kubernetes.io/projected/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-kube-api-access-tntx8\") pod \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.401230 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-swiftconf\") pod \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\" (UID: \"5f216e5d-f9e1-4513-bb1a-dcddb3febb29\") " Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.402064 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5f216e5d-f9e1-4513-bb1a-dcddb3febb29" (UID: "5f216e5d-f9e1-4513-bb1a-dcddb3febb29"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.402224 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5f216e5d-f9e1-4513-bb1a-dcddb3febb29" (UID: "5f216e5d-f9e1-4513-bb1a-dcddb3febb29"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.410441 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-kube-api-access-tntx8" (OuterVolumeSpecName: "kube-api-access-tntx8") pod "5f216e5d-f9e1-4513-bb1a-dcddb3febb29" (UID: "5f216e5d-f9e1-4513-bb1a-dcddb3febb29"). InnerVolumeSpecName "kube-api-access-tntx8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.429809 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-scripts" (OuterVolumeSpecName: "scripts") pod "5f216e5d-f9e1-4513-bb1a-dcddb3febb29" (UID: "5f216e5d-f9e1-4513-bb1a-dcddb3febb29"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.444363 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5f216e5d-f9e1-4513-bb1a-dcddb3febb29" (UID: "5f216e5d-f9e1-4513-bb1a-dcddb3febb29"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.450201 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5f216e5d-f9e1-4513-bb1a-dcddb3febb29" (UID: "5f216e5d-f9e1-4513-bb1a-dcddb3febb29"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.501042 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2"] Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.501420 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2"] Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.503825 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.503849 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.503859 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tntx8\" (UniqueName: \"kubernetes.io/projected/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-kube-api-access-tntx8\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.503869 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.503878 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:42 crc kubenswrapper[4738]: I0307 07:35:42.503887 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f216e5d-f9e1-4513-bb1a-dcddb3febb29-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 
07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.051849 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a51d1558801ddcf1a4533c0ed10411411035221266449c3bb0e01ee78c70c007" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.051917 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zmhw2" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.610693 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7"] Mar 07 07:35:43 crc kubenswrapper[4738]: E0307 07:35:43.611032 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f216e5d-f9e1-4513-bb1a-dcddb3febb29" containerName="swift-ring-rebalance" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.611050 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f216e5d-f9e1-4513-bb1a-dcddb3febb29" containerName="swift-ring-rebalance" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.611259 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f216e5d-f9e1-4513-bb1a-dcddb3febb29" containerName="swift-ring-rebalance" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.611700 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.614276 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.614532 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.640635 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7"] Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.743940 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f04e746-4687-4a76-9e5c-3f752b2af60d-scripts\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.744057 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f04e746-4687-4a76-9e5c-3f752b2af60d-swiftconf\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.744143 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f04e746-4687-4a76-9e5c-3f752b2af60d-ring-data-devices\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.744328 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqxzl\" (UniqueName: \"kubernetes.io/projected/5f04e746-4687-4a76-9e5c-3f752b2af60d-kube-api-access-fqxzl\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.744443 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f04e746-4687-4a76-9e5c-3f752b2af60d-dispersionconf\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.744476 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f04e746-4687-4a76-9e5c-3f752b2af60d-etc-swift\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.845335 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f04e746-4687-4a76-9e5c-3f752b2af60d-scripts\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.845412 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f04e746-4687-4a76-9e5c-3f752b2af60d-swiftconf\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc 
kubenswrapper[4738]: I0307 07:35:43.845437 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f04e746-4687-4a76-9e5c-3f752b2af60d-ring-data-devices\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.845461 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqxzl\" (UniqueName: \"kubernetes.io/projected/5f04e746-4687-4a76-9e5c-3f752b2af60d-kube-api-access-fqxzl\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.845488 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f04e746-4687-4a76-9e5c-3f752b2af60d-dispersionconf\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.845505 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f04e746-4687-4a76-9e5c-3f752b2af60d-etc-swift\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.845975 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f04e746-4687-4a76-9e5c-3f752b2af60d-etc-swift\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 
07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.846143 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f04e746-4687-4a76-9e5c-3f752b2af60d-scripts\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.846924 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f04e746-4687-4a76-9e5c-3f752b2af60d-ring-data-devices\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.850356 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f04e746-4687-4a76-9e5c-3f752b2af60d-swiftconf\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.859915 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f04e746-4687-4a76-9e5c-3f752b2af60d-dispersionconf\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc kubenswrapper[4738]: I0307 07:35:43.860969 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqxzl\" (UniqueName: \"kubernetes.io/projected/5f04e746-4687-4a76-9e5c-3f752b2af60d-kube-api-access-fqxzl\") pod \"swift-ring-rebalance-debug-tg7x7\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:43 crc 
kubenswrapper[4738]: I0307 07:35:43.933130 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:44 crc kubenswrapper[4738]: I0307 07:35:44.163373 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7"] Mar 07 07:35:44 crc kubenswrapper[4738]: I0307 07:35:44.399456 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f216e5d-f9e1-4513-bb1a-dcddb3febb29" path="/var/lib/kubelet/pods/5f216e5d-f9e1-4513-bb1a-dcddb3febb29/volumes" Mar 07 07:35:45 crc kubenswrapper[4738]: I0307 07:35:45.081107 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" event={"ID":"5f04e746-4687-4a76-9e5c-3f752b2af60d","Type":"ContainerStarted","Data":"8f2a192a2bfef3c624894e8cdb76cfbc7091a3b730fba395a0d60640f8efe5f7"} Mar 07 07:35:45 crc kubenswrapper[4738]: I0307 07:35:45.081180 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" event={"ID":"5f04e746-4687-4a76-9e5c-3f752b2af60d","Type":"ContainerStarted","Data":"e090b4b24462a12df2bfc8d1374155b2fded3a0b3a4e173cef910fe8f93832a1"} Mar 07 07:35:45 crc kubenswrapper[4738]: I0307 07:35:45.105134 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" podStartSLOduration=2.105111001 podStartE2EDuration="2.105111001s" podCreationTimestamp="2026-03-07 07:35:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:35:45.099779237 +0000 UTC m=+2163.564766578" watchObservedRunningTime="2026-03-07 07:35:45.105111001 +0000 UTC m=+2163.570098332" Mar 07 07:35:46 crc kubenswrapper[4738]: I0307 07:35:46.091029 4738 generic.go:334] "Generic (PLEG): container finished" 
podID="5f04e746-4687-4a76-9e5c-3f752b2af60d" containerID="8f2a192a2bfef3c624894e8cdb76cfbc7091a3b730fba395a0d60640f8efe5f7" exitCode=0 Mar 07 07:35:46 crc kubenswrapper[4738]: I0307 07:35:46.091127 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" event={"ID":"5f04e746-4687-4a76-9e5c-3f752b2af60d","Type":"ContainerDied","Data":"8f2a192a2bfef3c624894e8cdb76cfbc7091a3b730fba395a0d60640f8efe5f7"} Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.447633 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.480262 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7"] Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.488051 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7"] Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.598908 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f04e746-4687-4a76-9e5c-3f752b2af60d-dispersionconf\") pod \"5f04e746-4687-4a76-9e5c-3f752b2af60d\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.599247 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f04e746-4687-4a76-9e5c-3f752b2af60d-ring-data-devices\") pod \"5f04e746-4687-4a76-9e5c-3f752b2af60d\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.599439 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqxzl\" (UniqueName: \"kubernetes.io/projected/5f04e746-4687-4a76-9e5c-3f752b2af60d-kube-api-access-fqxzl\") 
pod \"5f04e746-4687-4a76-9e5c-3f752b2af60d\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.599682 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f04e746-4687-4a76-9e5c-3f752b2af60d-scripts\") pod \"5f04e746-4687-4a76-9e5c-3f752b2af60d\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.599860 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f04e746-4687-4a76-9e5c-3f752b2af60d-etc-swift\") pod \"5f04e746-4687-4a76-9e5c-3f752b2af60d\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.600015 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f04e746-4687-4a76-9e5c-3f752b2af60d-swiftconf\") pod \"5f04e746-4687-4a76-9e5c-3f752b2af60d\" (UID: \"5f04e746-4687-4a76-9e5c-3f752b2af60d\") " Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.600266 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f04e746-4687-4a76-9e5c-3f752b2af60d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5f04e746-4687-4a76-9e5c-3f752b2af60d" (UID: "5f04e746-4687-4a76-9e5c-3f752b2af60d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.600415 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f04e746-4687-4a76-9e5c-3f752b2af60d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5f04e746-4687-4a76-9e5c-3f752b2af60d" (UID: "5f04e746-4687-4a76-9e5c-3f752b2af60d"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.600814 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f04e746-4687-4a76-9e5c-3f752b2af60d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.600926 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f04e746-4687-4a76-9e5c-3f752b2af60d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.604601 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f04e746-4687-4a76-9e5c-3f752b2af60d-kube-api-access-fqxzl" (OuterVolumeSpecName: "kube-api-access-fqxzl") pod "5f04e746-4687-4a76-9e5c-3f752b2af60d" (UID: "5f04e746-4687-4a76-9e5c-3f752b2af60d"). InnerVolumeSpecName "kube-api-access-fqxzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.624351 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f04e746-4687-4a76-9e5c-3f752b2af60d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5f04e746-4687-4a76-9e5c-3f752b2af60d" (UID: "5f04e746-4687-4a76-9e5c-3f752b2af60d"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.634886 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f04e746-4687-4a76-9e5c-3f752b2af60d-scripts" (OuterVolumeSpecName: "scripts") pod "5f04e746-4687-4a76-9e5c-3f752b2af60d" (UID: "5f04e746-4687-4a76-9e5c-3f752b2af60d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.636476 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f04e746-4687-4a76-9e5c-3f752b2af60d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5f04e746-4687-4a76-9e5c-3f752b2af60d" (UID: "5f04e746-4687-4a76-9e5c-3f752b2af60d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.703098 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f04e746-4687-4a76-9e5c-3f752b2af60d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.703219 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqxzl\" (UniqueName: \"kubernetes.io/projected/5f04e746-4687-4a76-9e5c-3f752b2af60d-kube-api-access-fqxzl\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.703242 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f04e746-4687-4a76-9e5c-3f752b2af60d-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:47 crc kubenswrapper[4738]: I0307 07:35:47.703261 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f04e746-4687-4a76-9e5c-3f752b2af60d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.113050 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e090b4b24462a12df2bfc8d1374155b2fded3a0b3a4e173cef910fe8f93832a1" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.113296 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-tg7x7" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.401632 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f04e746-4687-4a76-9e5c-3f752b2af60d" path="/var/lib/kubelet/pods/5f04e746-4687-4a76-9e5c-3f752b2af60d/volumes" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.675699 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t"] Mar 07 07:35:48 crc kubenswrapper[4738]: E0307 07:35:48.676062 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f04e746-4687-4a76-9e5c-3f752b2af60d" containerName="swift-ring-rebalance" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.676082 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f04e746-4687-4a76-9e5c-3f752b2af60d" containerName="swift-ring-rebalance" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.676363 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f04e746-4687-4a76-9e5c-3f752b2af60d" containerName="swift-ring-rebalance" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.677317 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.680205 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.682227 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.695291 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t"] Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.720496 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/85e41874-8625-49a5-bd0d-0f48dc049b28-swiftconf\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.720587 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/85e41874-8625-49a5-bd0d-0f48dc049b28-ring-data-devices\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.720619 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/85e41874-8625-49a5-bd0d-0f48dc049b28-etc-swift\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.720654 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/85e41874-8625-49a5-bd0d-0f48dc049b28-dispersionconf\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.720683 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h55wj\" (UniqueName: \"kubernetes.io/projected/85e41874-8625-49a5-bd0d-0f48dc049b28-kube-api-access-h55wj\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.720710 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85e41874-8625-49a5-bd0d-0f48dc049b28-scripts\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.821794 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/85e41874-8625-49a5-bd0d-0f48dc049b28-swiftconf\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.821923 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/85e41874-8625-49a5-bd0d-0f48dc049b28-ring-data-devices\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc 
kubenswrapper[4738]: I0307 07:35:48.821962 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/85e41874-8625-49a5-bd0d-0f48dc049b28-etc-swift\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.822012 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/85e41874-8625-49a5-bd0d-0f48dc049b28-dispersionconf\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.822048 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h55wj\" (UniqueName: \"kubernetes.io/projected/85e41874-8625-49a5-bd0d-0f48dc049b28-kube-api-access-h55wj\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.822077 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85e41874-8625-49a5-bd0d-0f48dc049b28-scripts\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.822773 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/85e41874-8625-49a5-bd0d-0f48dc049b28-etc-swift\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc 
kubenswrapper[4738]: I0307 07:35:48.823381 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/85e41874-8625-49a5-bd0d-0f48dc049b28-ring-data-devices\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.824419 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85e41874-8625-49a5-bd0d-0f48dc049b28-scripts\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.832261 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/85e41874-8625-49a5-bd0d-0f48dc049b28-dispersionconf\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.840726 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/85e41874-8625-49a5-bd0d-0f48dc049b28-swiftconf\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:48 crc kubenswrapper[4738]: I0307 07:35:48.842285 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h55wj\" (UniqueName: \"kubernetes.io/projected/85e41874-8625-49a5-bd0d-0f48dc049b28-kube-api-access-h55wj\") pod \"swift-ring-rebalance-debug-c5b4t\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:49 crc kubenswrapper[4738]: 
I0307 07:35:49.003934 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:49 crc kubenswrapper[4738]: I0307 07:35:49.514485 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t"] Mar 07 07:35:49 crc kubenswrapper[4738]: W0307 07:35:49.516917 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85e41874_8625_49a5_bd0d_0f48dc049b28.slice/crio-48d1250b99c5b506a31ffcf60778b5bfe0704695722271dc796c47e07b595ac0 WatchSource:0}: Error finding container 48d1250b99c5b506a31ffcf60778b5bfe0704695722271dc796c47e07b595ac0: Status 404 returned error can't find the container with id 48d1250b99c5b506a31ffcf60778b5bfe0704695722271dc796c47e07b595ac0 Mar 07 07:35:50 crc kubenswrapper[4738]: I0307 07:35:50.151314 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" event={"ID":"85e41874-8625-49a5-bd0d-0f48dc049b28","Type":"ContainerStarted","Data":"637f6298b77264298786b4519fc2e83a968f3ac127127a960847f9d3e4cbee71"} Mar 07 07:35:50 crc kubenswrapper[4738]: I0307 07:35:50.151684 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" event={"ID":"85e41874-8625-49a5-bd0d-0f48dc049b28","Type":"ContainerStarted","Data":"48d1250b99c5b506a31ffcf60778b5bfe0704695722271dc796c47e07b595ac0"} Mar 07 07:35:51 crc kubenswrapper[4738]: I0307 07:35:51.165586 4738 generic.go:334] "Generic (PLEG): container finished" podID="85e41874-8625-49a5-bd0d-0f48dc049b28" containerID="637f6298b77264298786b4519fc2e83a968f3ac127127a960847f9d3e4cbee71" exitCode=0 Mar 07 07:35:51 crc kubenswrapper[4738]: I0307 07:35:51.166046 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" 
event={"ID":"85e41874-8625-49a5-bd0d-0f48dc049b28","Type":"ContainerDied","Data":"637f6298b77264298786b4519fc2e83a968f3ac127127a960847f9d3e4cbee71"} Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.484743 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.528034 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t"] Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.535224 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t"] Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.681196 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/85e41874-8625-49a5-bd0d-0f48dc049b28-swiftconf\") pod \"85e41874-8625-49a5-bd0d-0f48dc049b28\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.681256 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/85e41874-8625-49a5-bd0d-0f48dc049b28-ring-data-devices\") pod \"85e41874-8625-49a5-bd0d-0f48dc049b28\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.681321 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h55wj\" (UniqueName: \"kubernetes.io/projected/85e41874-8625-49a5-bd0d-0f48dc049b28-kube-api-access-h55wj\") pod \"85e41874-8625-49a5-bd0d-0f48dc049b28\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.681405 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/85e41874-8625-49a5-bd0d-0f48dc049b28-scripts\") pod \"85e41874-8625-49a5-bd0d-0f48dc049b28\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.681429 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/85e41874-8625-49a5-bd0d-0f48dc049b28-dispersionconf\") pod \"85e41874-8625-49a5-bd0d-0f48dc049b28\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.681467 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/85e41874-8625-49a5-bd0d-0f48dc049b28-etc-swift\") pod \"85e41874-8625-49a5-bd0d-0f48dc049b28\" (UID: \"85e41874-8625-49a5-bd0d-0f48dc049b28\") " Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.682099 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85e41874-8625-49a5-bd0d-0f48dc049b28-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "85e41874-8625-49a5-bd0d-0f48dc049b28" (UID: "85e41874-8625-49a5-bd0d-0f48dc049b28"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.682537 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/85e41874-8625-49a5-bd0d-0f48dc049b28-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "85e41874-8625-49a5-bd0d-0f48dc049b28" (UID: "85e41874-8625-49a5-bd0d-0f48dc049b28"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.687835 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/85e41874-8625-49a5-bd0d-0f48dc049b28-kube-api-access-h55wj" (OuterVolumeSpecName: "kube-api-access-h55wj") pod "85e41874-8625-49a5-bd0d-0f48dc049b28" (UID: "85e41874-8625-49a5-bd0d-0f48dc049b28"). InnerVolumeSpecName "kube-api-access-h55wj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.700116 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/85e41874-8625-49a5-bd0d-0f48dc049b28-scripts" (OuterVolumeSpecName: "scripts") pod "85e41874-8625-49a5-bd0d-0f48dc049b28" (UID: "85e41874-8625-49a5-bd0d-0f48dc049b28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.703283 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e41874-8625-49a5-bd0d-0f48dc049b28-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "85e41874-8625-49a5-bd0d-0f48dc049b28" (UID: "85e41874-8625-49a5-bd0d-0f48dc049b28"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.704007 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/85e41874-8625-49a5-bd0d-0f48dc049b28-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "85e41874-8625-49a5-bd0d-0f48dc049b28" (UID: "85e41874-8625-49a5-bd0d-0f48dc049b28"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.782968 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/85e41874-8625-49a5-bd0d-0f48dc049b28-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.783000 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/85e41874-8625-49a5-bd0d-0f48dc049b28-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.783014 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/85e41874-8625-49a5-bd0d-0f48dc049b28-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.783026 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/85e41874-8625-49a5-bd0d-0f48dc049b28-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.783037 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/85e41874-8625-49a5-bd0d-0f48dc049b28-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:52 crc kubenswrapper[4738]: I0307 07:35:52.783047 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h55wj\" (UniqueName: \"kubernetes.io/projected/85e41874-8625-49a5-bd0d-0f48dc049b28-kube-api-access-h55wj\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.190821 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48d1250b99c5b506a31ffcf60778b5bfe0704695722271dc796c47e07b595ac0" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.190913 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5b4t" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.665070 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pm68n"] Mar 07 07:35:53 crc kubenswrapper[4738]: E0307 07:35:53.665847 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="85e41874-8625-49a5-bd0d-0f48dc049b28" containerName="swift-ring-rebalance" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.665867 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="85e41874-8625-49a5-bd0d-0f48dc049b28" containerName="swift-ring-rebalance" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.666082 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="85e41874-8625-49a5-bd0d-0f48dc049b28" containerName="swift-ring-rebalance" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.666712 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.676383 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pm68n"] Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.695226 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/59f2c9c6-732d-4834-9ea4-f4805e8499c1-ring-data-devices\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.695448 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59f2c9c6-732d-4834-9ea4-f4805e8499c1-scripts\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: 
\"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.695515 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzdfn\" (UniqueName: \"kubernetes.io/projected/59f2c9c6-732d-4834-9ea4-f4805e8499c1-kube-api-access-mzdfn\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.695528 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.695693 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/59f2c9c6-732d-4834-9ea4-f4805e8499c1-swiftconf\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.695777 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/59f2c9c6-732d-4834-9ea4-f4805e8499c1-etc-swift\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.695816 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/59f2c9c6-732d-4834-9ea4-f4805e8499c1-dispersionconf\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc 
kubenswrapper[4738]: I0307 07:35:53.696497 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.796627 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59f2c9c6-732d-4834-9ea4-f4805e8499c1-scripts\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.796687 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzdfn\" (UniqueName: \"kubernetes.io/projected/59f2c9c6-732d-4834-9ea4-f4805e8499c1-kube-api-access-mzdfn\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.796740 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/59f2c9c6-732d-4834-9ea4-f4805e8499c1-swiftconf\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.796777 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/59f2c9c6-732d-4834-9ea4-f4805e8499c1-etc-swift\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.796805 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/59f2c9c6-732d-4834-9ea4-f4805e8499c1-dispersionconf\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.796848 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/59f2c9c6-732d-4834-9ea4-f4805e8499c1-ring-data-devices\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.797361 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/59f2c9c6-732d-4834-9ea4-f4805e8499c1-etc-swift\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.797596 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/59f2c9c6-732d-4834-9ea4-f4805e8499c1-ring-data-devices\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.797597 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59f2c9c6-732d-4834-9ea4-f4805e8499c1-scripts\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.800800 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/59f2c9c6-732d-4834-9ea4-f4805e8499c1-swiftconf\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.801342 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/59f2c9c6-732d-4834-9ea4-f4805e8499c1-dispersionconf\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:53 crc kubenswrapper[4738]: I0307 07:35:53.813147 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzdfn\" (UniqueName: \"kubernetes.io/projected/59f2c9c6-732d-4834-9ea4-f4805e8499c1-kube-api-access-mzdfn\") pod \"swift-ring-rebalance-debug-pm68n\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:54 crc kubenswrapper[4738]: I0307 07:35:54.014692 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:54 crc kubenswrapper[4738]: I0307 07:35:54.396529 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="85e41874-8625-49a5-bd0d-0f48dc049b28" path="/var/lib/kubelet/pods/85e41874-8625-49a5-bd0d-0f48dc049b28/volumes" Mar 07 07:35:54 crc kubenswrapper[4738]: I0307 07:35:54.453791 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pm68n"] Mar 07 07:35:55 crc kubenswrapper[4738]: I0307 07:35:55.224469 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" event={"ID":"59f2c9c6-732d-4834-9ea4-f4805e8499c1","Type":"ContainerStarted","Data":"67d2be20671c313b66cc7a0bfcd324ba91a11ed5a0980040772de4ba3b34d5bf"} Mar 07 07:35:55 crc kubenswrapper[4738]: I0307 07:35:55.224797 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" event={"ID":"59f2c9c6-732d-4834-9ea4-f4805e8499c1","Type":"ContainerStarted","Data":"976103c7d2ce523c97122ff4dac7a6eecc183cb48ff8902945039886a39630e9"} Mar 07 07:35:55 crc kubenswrapper[4738]: I0307 07:35:55.243887 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" podStartSLOduration=2.243870542 podStartE2EDuration="2.243870542s" podCreationTimestamp="2026-03-07 07:35:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:35:55.242505094 +0000 UTC m=+2173.707492415" watchObservedRunningTime="2026-03-07 07:35:55.243870542 +0000 UTC m=+2173.708857863" Mar 07 07:35:56 crc kubenswrapper[4738]: I0307 07:35:56.238759 4738 generic.go:334] "Generic (PLEG): container finished" podID="59f2c9c6-732d-4834-9ea4-f4805e8499c1" containerID="67d2be20671c313b66cc7a0bfcd324ba91a11ed5a0980040772de4ba3b34d5bf" exitCode=0 
Mar 07 07:35:56 crc kubenswrapper[4738]: I0307 07:35:56.238802 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" event={"ID":"59f2c9c6-732d-4834-9ea4-f4805e8499c1","Type":"ContainerDied","Data":"67d2be20671c313b66cc7a0bfcd324ba91a11ed5a0980040772de4ba3b34d5bf"} Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.645354 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.692695 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pm68n"] Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.700497 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pm68n"] Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.759909 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/59f2c9c6-732d-4834-9ea4-f4805e8499c1-dispersionconf\") pod \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.760062 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/59f2c9c6-732d-4834-9ea4-f4805e8499c1-etc-swift\") pod \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.760103 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/59f2c9c6-732d-4834-9ea4-f4805e8499c1-swiftconf\") pod \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.760328 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzdfn\" (UniqueName: \"kubernetes.io/projected/59f2c9c6-732d-4834-9ea4-f4805e8499c1-kube-api-access-mzdfn\") pod \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.760446 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59f2c9c6-732d-4834-9ea4-f4805e8499c1-scripts\") pod \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.760767 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59f2c9c6-732d-4834-9ea4-f4805e8499c1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "59f2c9c6-732d-4834-9ea4-f4805e8499c1" (UID: "59f2c9c6-732d-4834-9ea4-f4805e8499c1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.761128 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/59f2c9c6-732d-4834-9ea4-f4805e8499c1-ring-data-devices\") pod \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\" (UID: \"59f2c9c6-732d-4834-9ea4-f4805e8499c1\") " Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.761721 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59f2c9c6-732d-4834-9ea4-f4805e8499c1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "59f2c9c6-732d-4834-9ea4-f4805e8499c1" (UID: "59f2c9c6-732d-4834-9ea4-f4805e8499c1"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.762665 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/59f2c9c6-732d-4834-9ea4-f4805e8499c1-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.762724 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/59f2c9c6-732d-4834-9ea4-f4805e8499c1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.765360 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59f2c9c6-732d-4834-9ea4-f4805e8499c1-kube-api-access-mzdfn" (OuterVolumeSpecName: "kube-api-access-mzdfn") pod "59f2c9c6-732d-4834-9ea4-f4805e8499c1" (UID: "59f2c9c6-732d-4834-9ea4-f4805e8499c1"). InnerVolumeSpecName "kube-api-access-mzdfn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.785440 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f2c9c6-732d-4834-9ea4-f4805e8499c1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "59f2c9c6-732d-4834-9ea4-f4805e8499c1" (UID: "59f2c9c6-732d-4834-9ea4-f4805e8499c1"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.792345 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/59f2c9c6-732d-4834-9ea4-f4805e8499c1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "59f2c9c6-732d-4834-9ea4-f4805e8499c1" (UID: "59f2c9c6-732d-4834-9ea4-f4805e8499c1"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.798495 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59f2c9c6-732d-4834-9ea4-f4805e8499c1-scripts" (OuterVolumeSpecName: "scripts") pod "59f2c9c6-732d-4834-9ea4-f4805e8499c1" (UID: "59f2c9c6-732d-4834-9ea4-f4805e8499c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.864558 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/59f2c9c6-732d-4834-9ea4-f4805e8499c1-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.864602 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/59f2c9c6-732d-4834-9ea4-f4805e8499c1-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.864622 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzdfn\" (UniqueName: \"kubernetes.io/projected/59f2c9c6-732d-4834-9ea4-f4805e8499c1-kube-api-access-mzdfn\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:57 crc kubenswrapper[4738]: I0307 07:35:57.864639 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59f2c9c6-732d-4834-9ea4-f4805e8499c1-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.256758 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="976103c7d2ce523c97122ff4dac7a6eecc183cb48ff8902945039886a39630e9" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.256835 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pm68n" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.395600 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59f2c9c6-732d-4834-9ea4-f4805e8499c1" path="/var/lib/kubelet/pods/59f2c9c6-732d-4834-9ea4-f4805e8499c1/volumes" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.863344 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq"] Mar 07 07:35:58 crc kubenswrapper[4738]: E0307 07:35:58.863616 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59f2c9c6-732d-4834-9ea4-f4805e8499c1" containerName="swift-ring-rebalance" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.863630 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="59f2c9c6-732d-4834-9ea4-f4805e8499c1" containerName="swift-ring-rebalance" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.863819 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="59f2c9c6-732d-4834-9ea4-f4805e8499c1" containerName="swift-ring-rebalance" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.864378 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.866361 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.868103 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.877043 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14042d19-aa45-453a-978c-19a470726ba6-scripts\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.877088 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14042d19-aa45-453a-978c-19a470726ba6-ring-data-devices\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.877140 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14042d19-aa45-453a-978c-19a470726ba6-swiftconf\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.877220 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14042d19-aa45-453a-978c-19a470726ba6-etc-swift\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: 
\"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.877235 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs45h\" (UniqueName: \"kubernetes.io/projected/14042d19-aa45-453a-978c-19a470726ba6-kube-api-access-hs45h\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.877350 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14042d19-aa45-453a-978c-19a470726ba6-dispersionconf\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.886407 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq"] Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.978520 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14042d19-aa45-453a-978c-19a470726ba6-etc-swift\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.978564 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs45h\" (UniqueName: \"kubernetes.io/projected/14042d19-aa45-453a-978c-19a470726ba6-kube-api-access-hs45h\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 
07:35:58.978616 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14042d19-aa45-453a-978c-19a470726ba6-dispersionconf\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.978645 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14042d19-aa45-453a-978c-19a470726ba6-scripts\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.978681 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14042d19-aa45-453a-978c-19a470726ba6-ring-data-devices\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.978759 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14042d19-aa45-453a-978c-19a470726ba6-swiftconf\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.979034 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14042d19-aa45-453a-978c-19a470726ba6-etc-swift\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.979463 
4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14042d19-aa45-453a-978c-19a470726ba6-scripts\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.979806 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14042d19-aa45-453a-978c-19a470726ba6-ring-data-devices\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.984613 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14042d19-aa45-453a-978c-19a470726ba6-dispersionconf\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:58 crc kubenswrapper[4738]: I0307 07:35:58.986565 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14042d19-aa45-453a-978c-19a470726ba6-swiftconf\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:59 crc kubenswrapper[4738]: I0307 07:35:59.004636 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs45h\" (UniqueName: \"kubernetes.io/projected/14042d19-aa45-453a-978c-19a470726ba6-kube-api-access-hs45h\") pod \"swift-ring-rebalance-debug-lwwxq\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:59 crc kubenswrapper[4738]: I0307 07:35:59.178485 4738 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:35:59 crc kubenswrapper[4738]: I0307 07:35:59.699534 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq"] Mar 07 07:36:00 crc kubenswrapper[4738]: I0307 07:36:00.148632 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547816-nk69t"] Mar 07 07:36:00 crc kubenswrapper[4738]: I0307 07:36:00.150094 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547816-nk69t" Mar 07 07:36:00 crc kubenswrapper[4738]: I0307 07:36:00.152680 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:36:00 crc kubenswrapper[4738]: I0307 07:36:00.152791 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:36:00 crc kubenswrapper[4738]: I0307 07:36:00.153272 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:36:00 crc kubenswrapper[4738]: I0307 07:36:00.170670 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547816-nk69t"] Mar 07 07:36:00 crc kubenswrapper[4738]: I0307 07:36:00.272829 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" event={"ID":"14042d19-aa45-453a-978c-19a470726ba6","Type":"ContainerStarted","Data":"4e355a8f5aa677f793af95b57852a95a785489978e5fb65aaa821790532c8429"} Mar 07 07:36:00 crc kubenswrapper[4738]: I0307 07:36:00.272884 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" 
event={"ID":"14042d19-aa45-453a-978c-19a470726ba6","Type":"ContainerStarted","Data":"e64e71ab8407384ba66f7a614c1f3e8948f121453a2553c4f6c96ae8bf01b41e"} Mar 07 07:36:00 crc kubenswrapper[4738]: I0307 07:36:00.297127 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" podStartSLOduration=2.297102093 podStartE2EDuration="2.297102093s" podCreationTimestamp="2026-03-07 07:35:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:36:00.285723735 +0000 UTC m=+2178.750711056" watchObservedRunningTime="2026-03-07 07:36:00.297102093 +0000 UTC m=+2178.762089454" Mar 07 07:36:00 crc kubenswrapper[4738]: I0307 07:36:00.297369 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xw9b\" (UniqueName: \"kubernetes.io/projected/c531e025-5682-49d9-9e7e-0e7191fc00a4-kube-api-access-8xw9b\") pod \"auto-csr-approver-29547816-nk69t\" (UID: \"c531e025-5682-49d9-9e7e-0e7191fc00a4\") " pod="openshift-infra/auto-csr-approver-29547816-nk69t" Mar 07 07:36:00 crc kubenswrapper[4738]: I0307 07:36:00.398540 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xw9b\" (UniqueName: \"kubernetes.io/projected/c531e025-5682-49d9-9e7e-0e7191fc00a4-kube-api-access-8xw9b\") pod \"auto-csr-approver-29547816-nk69t\" (UID: \"c531e025-5682-49d9-9e7e-0e7191fc00a4\") " pod="openshift-infra/auto-csr-approver-29547816-nk69t" Mar 07 07:36:00 crc kubenswrapper[4738]: I0307 07:36:00.421454 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xw9b\" (UniqueName: \"kubernetes.io/projected/c531e025-5682-49d9-9e7e-0e7191fc00a4-kube-api-access-8xw9b\") pod \"auto-csr-approver-29547816-nk69t\" (UID: \"c531e025-5682-49d9-9e7e-0e7191fc00a4\") " pod="openshift-infra/auto-csr-approver-29547816-nk69t" Mar 
07 07:36:00 crc kubenswrapper[4738]: I0307 07:36:00.467696 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547816-nk69t" Mar 07 07:36:00 crc kubenswrapper[4738]: I0307 07:36:00.900289 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547816-nk69t"] Mar 07 07:36:00 crc kubenswrapper[4738]: W0307 07:36:00.911897 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc531e025_5682_49d9_9e7e_0e7191fc00a4.slice/crio-b6d4db11d2523f46165ca0aaa330d23fda2b43d00b867d458610b526a4776feb WatchSource:0}: Error finding container b6d4db11d2523f46165ca0aaa330d23fda2b43d00b867d458610b526a4776feb: Status 404 returned error can't find the container with id b6d4db11d2523f46165ca0aaa330d23fda2b43d00b867d458610b526a4776feb Mar 07 07:36:01 crc kubenswrapper[4738]: I0307 07:36:01.284435 4738 generic.go:334] "Generic (PLEG): container finished" podID="14042d19-aa45-453a-978c-19a470726ba6" containerID="4e355a8f5aa677f793af95b57852a95a785489978e5fb65aaa821790532c8429" exitCode=0 Mar 07 07:36:01 crc kubenswrapper[4738]: I0307 07:36:01.284506 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" event={"ID":"14042d19-aa45-453a-978c-19a470726ba6","Type":"ContainerDied","Data":"4e355a8f5aa677f793af95b57852a95a785489978e5fb65aaa821790532c8429"} Mar 07 07:36:01 crc kubenswrapper[4738]: I0307 07:36:01.286630 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547816-nk69t" event={"ID":"c531e025-5682-49d9-9e7e-0e7191fc00a4","Type":"ContainerStarted","Data":"b6d4db11d2523f46165ca0aaa330d23fda2b43d00b867d458610b526a4776feb"} Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.635679 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.684759 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq"] Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.689999 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq"] Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.747191 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14042d19-aa45-453a-978c-19a470726ba6-dispersionconf\") pod \"14042d19-aa45-453a-978c-19a470726ba6\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.747236 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14042d19-aa45-453a-978c-19a470726ba6-scripts\") pod \"14042d19-aa45-453a-978c-19a470726ba6\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.747259 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14042d19-aa45-453a-978c-19a470726ba6-ring-data-devices\") pod \"14042d19-aa45-453a-978c-19a470726ba6\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.747328 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs45h\" (UniqueName: \"kubernetes.io/projected/14042d19-aa45-453a-978c-19a470726ba6-kube-api-access-hs45h\") pod \"14042d19-aa45-453a-978c-19a470726ba6\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.747392 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14042d19-aa45-453a-978c-19a470726ba6-etc-swift\") pod \"14042d19-aa45-453a-978c-19a470726ba6\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.747412 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14042d19-aa45-453a-978c-19a470726ba6-swiftconf\") pod \"14042d19-aa45-453a-978c-19a470726ba6\" (UID: \"14042d19-aa45-453a-978c-19a470726ba6\") " Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.748270 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14042d19-aa45-453a-978c-19a470726ba6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "14042d19-aa45-453a-978c-19a470726ba6" (UID: "14042d19-aa45-453a-978c-19a470726ba6"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.748381 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14042d19-aa45-453a-978c-19a470726ba6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "14042d19-aa45-453a-978c-19a470726ba6" (UID: "14042d19-aa45-453a-978c-19a470726ba6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.753364 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14042d19-aa45-453a-978c-19a470726ba6-kube-api-access-hs45h" (OuterVolumeSpecName: "kube-api-access-hs45h") pod "14042d19-aa45-453a-978c-19a470726ba6" (UID: "14042d19-aa45-453a-978c-19a470726ba6"). InnerVolumeSpecName "kube-api-access-hs45h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.767533 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14042d19-aa45-453a-978c-19a470726ba6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "14042d19-aa45-453a-978c-19a470726ba6" (UID: "14042d19-aa45-453a-978c-19a470726ba6"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.768345 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14042d19-aa45-453a-978c-19a470726ba6-scripts" (OuterVolumeSpecName: "scripts") pod "14042d19-aa45-453a-978c-19a470726ba6" (UID: "14042d19-aa45-453a-978c-19a470726ba6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.770092 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14042d19-aa45-453a-978c-19a470726ba6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "14042d19-aa45-453a-978c-19a470726ba6" (UID: "14042d19-aa45-453a-978c-19a470726ba6"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.849104 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14042d19-aa45-453a-978c-19a470726ba6-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.849143 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14042d19-aa45-453a-978c-19a470726ba6-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.849172 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14042d19-aa45-453a-978c-19a470726ba6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.849188 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs45h\" (UniqueName: \"kubernetes.io/projected/14042d19-aa45-453a-978c-19a470726ba6-kube-api-access-hs45h\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.849201 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14042d19-aa45-453a-978c-19a470726ba6-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:02 crc kubenswrapper[4738]: I0307 07:36:02.849212 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14042d19-aa45-453a-978c-19a470726ba6-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.304215 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547816-nk69t" event={"ID":"c531e025-5682-49d9-9e7e-0e7191fc00a4","Type":"ContainerStarted","Data":"96396f0f7143c90fc689a396468e6743f8cdeac5e8a2ded6b6d45420ea48a766"} Mar 07 07:36:03 crc 
kubenswrapper[4738]: I0307 07:36:03.306140 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e64e71ab8407384ba66f7a614c1f3e8948f121453a2553c4f6c96ae8bf01b41e" Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.306226 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-lwwxq" Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.326816 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547816-nk69t" podStartSLOduration=1.367867674 podStartE2EDuration="3.326798514s" podCreationTimestamp="2026-03-07 07:36:00 +0000 UTC" firstStartedPulling="2026-03-07 07:36:00.91409994 +0000 UTC m=+2179.379087271" lastFinishedPulling="2026-03-07 07:36:02.87303079 +0000 UTC m=+2181.338018111" observedRunningTime="2026-03-07 07:36:03.325621352 +0000 UTC m=+2181.790608683" watchObservedRunningTime="2026-03-07 07:36:03.326798514 +0000 UTC m=+2181.791785845" Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.825454 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd"] Mar 07 07:36:03 crc kubenswrapper[4738]: E0307 07:36:03.825780 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14042d19-aa45-453a-978c-19a470726ba6" containerName="swift-ring-rebalance" Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.825795 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="14042d19-aa45-453a-978c-19a470726ba6" containerName="swift-ring-rebalance" Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.825959 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="14042d19-aa45-453a-978c-19a470726ba6" containerName="swift-ring-rebalance" Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.826537 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.828796 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.830481 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.836682 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd"] Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.966795 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cf45625a-7c60-499c-a958-4634d519480a-ring-data-devices\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.966834 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cf45625a-7c60-499c-a958-4634d519480a-dispersionconf\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.966891 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf45625a-7c60-499c-a958-4634d519480a-scripts\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.966913 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cf45625a-7c60-499c-a958-4634d519480a-etc-swift\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.966957 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cf45625a-7c60-499c-a958-4634d519480a-swiftconf\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:03 crc kubenswrapper[4738]: I0307 07:36:03.967014 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxxkc\" (UniqueName: \"kubernetes.io/projected/cf45625a-7c60-499c-a958-4634d519480a-kube-api-access-cxxkc\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 07:36:04.068249 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxxkc\" (UniqueName: \"kubernetes.io/projected/cf45625a-7c60-499c-a958-4634d519480a-kube-api-access-cxxkc\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 07:36:04.068348 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cf45625a-7c60-499c-a958-4634d519480a-ring-data-devices\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" 
Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 07:36:04.068375 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cf45625a-7c60-499c-a958-4634d519480a-dispersionconf\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 07:36:04.068423 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf45625a-7c60-499c-a958-4634d519480a-scripts\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 07:36:04.068452 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cf45625a-7c60-499c-a958-4634d519480a-etc-swift\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 07:36:04.068494 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cf45625a-7c60-499c-a958-4634d519480a-swiftconf\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 07:36:04.069629 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cf45625a-7c60-499c-a958-4634d519480a-ring-data-devices\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:04 
crc kubenswrapper[4738]: I0307 07:36:04.069789 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cf45625a-7c60-499c-a958-4634d519480a-etc-swift\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 07:36:04.070126 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf45625a-7c60-499c-a958-4634d519480a-scripts\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 07:36:04.073620 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cf45625a-7c60-499c-a958-4634d519480a-dispersionconf\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 07:36:04.073859 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cf45625a-7c60-499c-a958-4634d519480a-swiftconf\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 07:36:04.094703 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxxkc\" (UniqueName: \"kubernetes.io/projected/cf45625a-7c60-499c-a958-4634d519480a-kube-api-access-cxxkc\") pod \"swift-ring-rebalance-debug-5p4kd\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 
07:36:04.143357 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 07:36:04.315534 4738 generic.go:334] "Generic (PLEG): container finished" podID="c531e025-5682-49d9-9e7e-0e7191fc00a4" containerID="96396f0f7143c90fc689a396468e6743f8cdeac5e8a2ded6b6d45420ea48a766" exitCode=0 Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 07:36:04.315796 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547816-nk69t" event={"ID":"c531e025-5682-49d9-9e7e-0e7191fc00a4","Type":"ContainerDied","Data":"96396f0f7143c90fc689a396468e6743f8cdeac5e8a2ded6b6d45420ea48a766"} Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 07:36:04.394764 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14042d19-aa45-453a-978c-19a470726ba6" path="/var/lib/kubelet/pods/14042d19-aa45-453a-978c-19a470726ba6/volumes" Mar 07 07:36:04 crc kubenswrapper[4738]: I0307 07:36:04.552059 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd"] Mar 07 07:36:04 crc kubenswrapper[4738]: W0307 07:36:04.574387 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf45625a_7c60_499c_a958_4634d519480a.slice/crio-edee959ddcdd3c5a931a0828effe63720668350ad12715f328502946db097139 WatchSource:0}: Error finding container edee959ddcdd3c5a931a0828effe63720668350ad12715f328502946db097139: Status 404 returned error can't find the container with id edee959ddcdd3c5a931a0828effe63720668350ad12715f328502946db097139 Mar 07 07:36:05 crc kubenswrapper[4738]: I0307 07:36:05.328235 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" 
event={"ID":"cf45625a-7c60-499c-a958-4634d519480a","Type":"ContainerStarted","Data":"4e236a0f9a3254972c2324ba0316af7a8f3f2d67a9f186dfa09700fe29c29c46"} Mar 07 07:36:05 crc kubenswrapper[4738]: I0307 07:36:05.328629 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" event={"ID":"cf45625a-7c60-499c-a958-4634d519480a","Type":"ContainerStarted","Data":"edee959ddcdd3c5a931a0828effe63720668350ad12715f328502946db097139"} Mar 07 07:36:05 crc kubenswrapper[4738]: I0307 07:36:05.354888 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" podStartSLOduration=2.354870088 podStartE2EDuration="2.354870088s" podCreationTimestamp="2026-03-07 07:36:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:36:05.347382915 +0000 UTC m=+2183.812370326" watchObservedRunningTime="2026-03-07 07:36:05.354870088 +0000 UTC m=+2183.819857419" Mar 07 07:36:05 crc kubenswrapper[4738]: I0307 07:36:05.714446 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547816-nk69t" Mar 07 07:36:05 crc kubenswrapper[4738]: I0307 07:36:05.815766 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xw9b\" (UniqueName: \"kubernetes.io/projected/c531e025-5682-49d9-9e7e-0e7191fc00a4-kube-api-access-8xw9b\") pod \"c531e025-5682-49d9-9e7e-0e7191fc00a4\" (UID: \"c531e025-5682-49d9-9e7e-0e7191fc00a4\") " Mar 07 07:36:05 crc kubenswrapper[4738]: I0307 07:36:05.829065 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c531e025-5682-49d9-9e7e-0e7191fc00a4-kube-api-access-8xw9b" (OuterVolumeSpecName: "kube-api-access-8xw9b") pod "c531e025-5682-49d9-9e7e-0e7191fc00a4" (UID: "c531e025-5682-49d9-9e7e-0e7191fc00a4"). 
InnerVolumeSpecName "kube-api-access-8xw9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:05 crc kubenswrapper[4738]: I0307 07:36:05.918769 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xw9b\" (UniqueName: \"kubernetes.io/projected/c531e025-5682-49d9-9e7e-0e7191fc00a4-kube-api-access-8xw9b\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:06 crc kubenswrapper[4738]: I0307 07:36:06.337859 4738 generic.go:334] "Generic (PLEG): container finished" podID="cf45625a-7c60-499c-a958-4634d519480a" containerID="4e236a0f9a3254972c2324ba0316af7a8f3f2d67a9f186dfa09700fe29c29c46" exitCode=0 Mar 07 07:36:06 crc kubenswrapper[4738]: I0307 07:36:06.337926 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" event={"ID":"cf45625a-7c60-499c-a958-4634d519480a","Type":"ContainerDied","Data":"4e236a0f9a3254972c2324ba0316af7a8f3f2d67a9f186dfa09700fe29c29c46"} Mar 07 07:36:06 crc kubenswrapper[4738]: I0307 07:36:06.339346 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547816-nk69t" event={"ID":"c531e025-5682-49d9-9e7e-0e7191fc00a4","Type":"ContainerDied","Data":"b6d4db11d2523f46165ca0aaa330d23fda2b43d00b867d458610b526a4776feb"} Mar 07 07:36:06 crc kubenswrapper[4738]: I0307 07:36:06.339407 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6d4db11d2523f46165ca0aaa330d23fda2b43d00b867d458610b526a4776feb" Mar 07 07:36:06 crc kubenswrapper[4738]: I0307 07:36:06.339418 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547816-nk69t" Mar 07 07:36:06 crc kubenswrapper[4738]: I0307 07:36:06.789720 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547810-4h5qr"] Mar 07 07:36:06 crc kubenswrapper[4738]: I0307 07:36:06.795523 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547810-4h5qr"] Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.639320 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.692826 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd"] Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.692942 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd"] Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.744220 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cf45625a-7c60-499c-a958-4634d519480a-ring-data-devices\") pod \"cf45625a-7c60-499c-a958-4634d519480a\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.744356 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cf45625a-7c60-499c-a958-4634d519480a-dispersionconf\") pod \"cf45625a-7c60-499c-a958-4634d519480a\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.744400 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxxkc\" (UniqueName: \"kubernetes.io/projected/cf45625a-7c60-499c-a958-4634d519480a-kube-api-access-cxxkc\") pod 
\"cf45625a-7c60-499c-a958-4634d519480a\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.744420 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf45625a-7c60-499c-a958-4634d519480a-scripts\") pod \"cf45625a-7c60-499c-a958-4634d519480a\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.744459 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cf45625a-7c60-499c-a958-4634d519480a-etc-swift\") pod \"cf45625a-7c60-499c-a958-4634d519480a\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.744482 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cf45625a-7c60-499c-a958-4634d519480a-swiftconf\") pod \"cf45625a-7c60-499c-a958-4634d519480a\" (UID: \"cf45625a-7c60-499c-a958-4634d519480a\") " Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.745214 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf45625a-7c60-499c-a958-4634d519480a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "cf45625a-7c60-499c-a958-4634d519480a" (UID: "cf45625a-7c60-499c-a958-4634d519480a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.745309 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf45625a-7c60-499c-a958-4634d519480a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "cf45625a-7c60-499c-a958-4634d519480a" (UID: "cf45625a-7c60-499c-a958-4634d519480a"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.749371 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf45625a-7c60-499c-a958-4634d519480a-kube-api-access-cxxkc" (OuterVolumeSpecName: "kube-api-access-cxxkc") pod "cf45625a-7c60-499c-a958-4634d519480a" (UID: "cf45625a-7c60-499c-a958-4634d519480a"). InnerVolumeSpecName "kube-api-access-cxxkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.768371 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf45625a-7c60-499c-a958-4634d519480a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "cf45625a-7c60-499c-a958-4634d519480a" (UID: "cf45625a-7c60-499c-a958-4634d519480a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.769114 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf45625a-7c60-499c-a958-4634d519480a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "cf45625a-7c60-499c-a958-4634d519480a" (UID: "cf45625a-7c60-499c-a958-4634d519480a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.770691 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf45625a-7c60-499c-a958-4634d519480a-scripts" (OuterVolumeSpecName: "scripts") pod "cf45625a-7c60-499c-a958-4634d519480a" (UID: "cf45625a-7c60-499c-a958-4634d519480a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.847000 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/cf45625a-7c60-499c-a958-4634d519480a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.847036 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/cf45625a-7c60-499c-a958-4634d519480a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.847047 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxxkc\" (UniqueName: \"kubernetes.io/projected/cf45625a-7c60-499c-a958-4634d519480a-kube-api-access-cxxkc\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.847057 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cf45625a-7c60-499c-a958-4634d519480a-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.847067 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/cf45625a-7c60-499c-a958-4634d519480a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:07 crc kubenswrapper[4738]: I0307 07:36:07.847078 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/cf45625a-7c60-499c-a958-4634d519480a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.363086 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edee959ddcdd3c5a931a0828effe63720668350ad12715f328502946db097139" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.363136 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-5p4kd" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.393264 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf45625a-7c60-499c-a958-4634d519480a" path="/var/lib/kubelet/pods/cf45625a-7c60-499c-a958-4634d519480a/volumes" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.393711 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d04ed723-5f2f-402a-96c8-7249338d43ec" path="/var/lib/kubelet/pods/d04ed723-5f2f-402a-96c8-7249338d43ec/volumes" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.905173 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx"] Mar 07 07:36:08 crc kubenswrapper[4738]: E0307 07:36:08.905438 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c531e025-5682-49d9-9e7e-0e7191fc00a4" containerName="oc" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.905449 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="c531e025-5682-49d9-9e7e-0e7191fc00a4" containerName="oc" Mar 07 07:36:08 crc kubenswrapper[4738]: E0307 07:36:08.905465 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf45625a-7c60-499c-a958-4634d519480a" containerName="swift-ring-rebalance" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.905472 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf45625a-7c60-499c-a958-4634d519480a" containerName="swift-ring-rebalance" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.905596 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="c531e025-5682-49d9-9e7e-0e7191fc00a4" containerName="oc" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.905616 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf45625a-7c60-499c-a958-4634d519480a" containerName="swift-ring-rebalance" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.906028 4738 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.909396 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.909919 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.927832 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx"] Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.968243 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-dispersionconf\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.968625 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-scripts\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.968704 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-swiftconf\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.968734 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhhk7\" (UniqueName: \"kubernetes.io/projected/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-kube-api-access-qhhk7\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.968774 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-ring-data-devices\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:08 crc kubenswrapper[4738]: I0307 07:36:08.968893 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-etc-swift\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:09 crc kubenswrapper[4738]: I0307 07:36:09.070112 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-dispersionconf\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:09 crc kubenswrapper[4738]: I0307 07:36:09.070227 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-scripts\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:09 
crc kubenswrapper[4738]: I0307 07:36:09.070295 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-swiftconf\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:09 crc kubenswrapper[4738]: I0307 07:36:09.070349 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhhk7\" (UniqueName: \"kubernetes.io/projected/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-kube-api-access-qhhk7\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:09 crc kubenswrapper[4738]: I0307 07:36:09.070416 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-ring-data-devices\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:09 crc kubenswrapper[4738]: I0307 07:36:09.070478 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-etc-swift\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:09 crc kubenswrapper[4738]: I0307 07:36:09.070970 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-scripts\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:09 
crc kubenswrapper[4738]: I0307 07:36:09.071002 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-etc-swift\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:09 crc kubenswrapper[4738]: I0307 07:36:09.071097 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-ring-data-devices\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:09 crc kubenswrapper[4738]: I0307 07:36:09.073851 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-dispersionconf\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:09 crc kubenswrapper[4738]: I0307 07:36:09.074582 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-swiftconf\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:09 crc kubenswrapper[4738]: I0307 07:36:09.085813 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhhk7\" (UniqueName: \"kubernetes.io/projected/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-kube-api-access-qhhk7\") pod \"swift-ring-rebalance-debug-kgwjx\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:09 crc 
kubenswrapper[4738]: I0307 07:36:09.229674 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:09 crc kubenswrapper[4738]: I0307 07:36:09.658048 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx"] Mar 07 07:36:09 crc kubenswrapper[4738]: W0307 07:36:09.670498 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06a4b7ec_ac7a_4d5d_97f2_3f7c22e39642.slice/crio-c3474e6dd0dd46089d1c668492492a83d6a0d5a454a9eaf5fc0a3bdddb842a9d WatchSource:0}: Error finding container c3474e6dd0dd46089d1c668492492a83d6a0d5a454a9eaf5fc0a3bdddb842a9d: Status 404 returned error can't find the container with id c3474e6dd0dd46089d1c668492492a83d6a0d5a454a9eaf5fc0a3bdddb842a9d Mar 07 07:36:10 crc kubenswrapper[4738]: I0307 07:36:10.398605 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" event={"ID":"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642","Type":"ContainerStarted","Data":"a396130de5bdade456323f174fc65b9350f0dc57450b18af6974f1cc71c2d222"} Mar 07 07:36:10 crc kubenswrapper[4738]: I0307 07:36:10.398665 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" event={"ID":"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642","Type":"ContainerStarted","Data":"c3474e6dd0dd46089d1c668492492a83d6a0d5a454a9eaf5fc0a3bdddb842a9d"} Mar 07 07:36:10 crc kubenswrapper[4738]: I0307 07:36:10.419515 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" podStartSLOduration=2.419499649 podStartE2EDuration="2.419499649s" podCreationTimestamp="2026-03-07 07:36:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 
07:36:10.414912184 +0000 UTC m=+2188.879899505" watchObservedRunningTime="2026-03-07 07:36:10.419499649 +0000 UTC m=+2188.884486970" Mar 07 07:36:11 crc kubenswrapper[4738]: I0307 07:36:11.404674 4738 generic.go:334] "Generic (PLEG): container finished" podID="06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642" containerID="a396130de5bdade456323f174fc65b9350f0dc57450b18af6974f1cc71c2d222" exitCode=0 Mar 07 07:36:11 crc kubenswrapper[4738]: I0307 07:36:11.404738 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" event={"ID":"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642","Type":"ContainerDied","Data":"a396130de5bdade456323f174fc65b9350f0dc57450b18af6974f1cc71c2d222"} Mar 07 07:36:12 crc kubenswrapper[4738]: I0307 07:36:12.727860 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:12 crc kubenswrapper[4738]: I0307 07:36:12.758203 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx"] Mar 07 07:36:12 crc kubenswrapper[4738]: I0307 07:36:12.764275 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx"] Mar 07 07:36:12 crc kubenswrapper[4738]: I0307 07:36:12.929903 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-etc-swift\") pod \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " Mar 07 07:36:12 crc kubenswrapper[4738]: I0307 07:36:12.929994 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhhk7\" (UniqueName: \"kubernetes.io/projected/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-kube-api-access-qhhk7\") pod \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " Mar 07 07:36:12 crc 
kubenswrapper[4738]: I0307 07:36:12.930055 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-ring-data-devices\") pod \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " Mar 07 07:36:12 crc kubenswrapper[4738]: I0307 07:36:12.930149 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-dispersionconf\") pod \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " Mar 07 07:36:12 crc kubenswrapper[4738]: I0307 07:36:12.930294 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-swiftconf\") pod \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " Mar 07 07:36:12 crc kubenswrapper[4738]: I0307 07:36:12.930330 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-scripts\") pod \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\" (UID: \"06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642\") " Mar 07 07:36:12 crc kubenswrapper[4738]: I0307 07:36:12.930946 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642" (UID: "06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:12 crc kubenswrapper[4738]: I0307 07:36:12.931104 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642" (UID: "06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:12 crc kubenswrapper[4738]: I0307 07:36:12.937249 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-kube-api-access-qhhk7" (OuterVolumeSpecName: "kube-api-access-qhhk7") pod "06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642" (UID: "06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642"). InnerVolumeSpecName "kube-api-access-qhhk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:12 crc kubenswrapper[4738]: I0307 07:36:12.953727 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-scripts" (OuterVolumeSpecName: "scripts") pod "06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642" (UID: "06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:12 crc kubenswrapper[4738]: I0307 07:36:12.960803 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642" (UID: "06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:12 crc kubenswrapper[4738]: I0307 07:36:12.969378 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642" (UID: "06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:13 crc kubenswrapper[4738]: I0307 07:36:13.032085 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:13 crc kubenswrapper[4738]: I0307 07:36:13.032137 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:13 crc kubenswrapper[4738]: I0307 07:36:13.032184 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:13 crc kubenswrapper[4738]: I0307 07:36:13.032212 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhhk7\" (UniqueName: \"kubernetes.io/projected/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-kube-api-access-qhhk7\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:13 crc kubenswrapper[4738]: I0307 07:36:13.032239 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:13 crc kubenswrapper[4738]: I0307 07:36:13.032265 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:13 crc kubenswrapper[4738]: I0307 07:36:13.444378 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3474e6dd0dd46089d1c668492492a83d6a0d5a454a9eaf5fc0a3bdddb842a9d" Mar 07 07:36:13 crc kubenswrapper[4738]: I0307 07:36:13.444639 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-kgwjx" Mar 07 07:36:13 crc kubenswrapper[4738]: I0307 07:36:13.988608 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xng5r"] Mar 07 07:36:13 crc kubenswrapper[4738]: E0307 07:36:13.988960 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642" containerName="swift-ring-rebalance" Mar 07 07:36:13 crc kubenswrapper[4738]: I0307 07:36:13.988977 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642" containerName="swift-ring-rebalance" Mar 07 07:36:13 crc kubenswrapper[4738]: I0307 07:36:13.989175 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642" containerName="swift-ring-rebalance" Mar 07 07:36:13 crc kubenswrapper[4738]: I0307 07:36:13.989796 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:13 crc kubenswrapper[4738]: I0307 07:36:13.992406 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:36:13 crc kubenswrapper[4738]: I0307 07:36:13.993087 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.004307 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xng5r"] Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.047359 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb07958-7e0d-4002-8a90-40f73d5aaebb-scripts\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.047433 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb07958-7e0d-4002-8a90-40f73d5aaebb-ring-data-devices\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.047469 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb07958-7e0d-4002-8a90-40f73d5aaebb-swiftconf\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.047509 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb07958-7e0d-4002-8a90-40f73d5aaebb-dispersionconf\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.047663 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z4ht\" (UniqueName: \"kubernetes.io/projected/8fb07958-7e0d-4002-8a90-40f73d5aaebb-kube-api-access-2z4ht\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.047749 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb07958-7e0d-4002-8a90-40f73d5aaebb-etc-swift\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.149397 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z4ht\" (UniqueName: \"kubernetes.io/projected/8fb07958-7e0d-4002-8a90-40f73d5aaebb-kube-api-access-2z4ht\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.149484 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb07958-7e0d-4002-8a90-40f73d5aaebb-etc-swift\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 
07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.149525 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb07958-7e0d-4002-8a90-40f73d5aaebb-scripts\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.149568 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb07958-7e0d-4002-8a90-40f73d5aaebb-ring-data-devices\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.149598 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb07958-7e0d-4002-8a90-40f73d5aaebb-swiftconf\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.149640 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb07958-7e0d-4002-8a90-40f73d5aaebb-dispersionconf\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.151022 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb07958-7e0d-4002-8a90-40f73d5aaebb-scripts\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc 
kubenswrapper[4738]: I0307 07:36:14.151061 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb07958-7e0d-4002-8a90-40f73d5aaebb-ring-data-devices\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.151431 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb07958-7e0d-4002-8a90-40f73d5aaebb-etc-swift\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.163556 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb07958-7e0d-4002-8a90-40f73d5aaebb-dispersionconf\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.169902 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb07958-7e0d-4002-8a90-40f73d5aaebb-swiftconf\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.171147 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z4ht\" (UniqueName: \"kubernetes.io/projected/8fb07958-7e0d-4002-8a90-40f73d5aaebb-kube-api-access-2z4ht\") pod \"swift-ring-rebalance-debug-xng5r\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: 
I0307 07:36:14.319511 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.397459 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642" path="/var/lib/kubelet/pods/06a4b7ec-ac7a-4d5d-97f2-3f7c22e39642/volumes" Mar 07 07:36:14 crc kubenswrapper[4738]: I0307 07:36:14.601245 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xng5r"] Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.311246 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9fwcr"] Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.313346 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.324650 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9fwcr"] Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.462013 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" event={"ID":"8fb07958-7e0d-4002-8a90-40f73d5aaebb","Type":"ContainerStarted","Data":"2fec0cc6fcd30de4fe561e0f5da3a656aa7adf0931e554cb377def83ab2bf456"} Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.462059 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" event={"ID":"8fb07958-7e0d-4002-8a90-40f73d5aaebb","Type":"ContainerStarted","Data":"2c7df23acdb759e3fdf18d6d8f620407de3f864aa084db7157f504d55528a7aa"} Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.466654 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-catalog-content\") pod \"redhat-marketplace-9fwcr\" (UID: \"98312b5a-358f-4fa8-ba48-96edf7bf4b0f\") " pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.466732 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx66r\" (UniqueName: \"kubernetes.io/projected/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-kube-api-access-gx66r\") pod \"redhat-marketplace-9fwcr\" (UID: \"98312b5a-358f-4fa8-ba48-96edf7bf4b0f\") " pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.466775 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-utilities\") pod \"redhat-marketplace-9fwcr\" (UID: \"98312b5a-358f-4fa8-ba48-96edf7bf4b0f\") " pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.483955 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" podStartSLOduration=2.483935304 podStartE2EDuration="2.483935304s" podCreationTimestamp="2026-03-07 07:36:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:36:15.480779809 +0000 UTC m=+2193.945767130" watchObservedRunningTime="2026-03-07 07:36:15.483935304 +0000 UTC m=+2193.948922625" Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.568144 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-utilities\") pod \"redhat-marketplace-9fwcr\" (UID: \"98312b5a-358f-4fa8-ba48-96edf7bf4b0f\") " 
pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.568226 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-catalog-content\") pod \"redhat-marketplace-9fwcr\" (UID: \"98312b5a-358f-4fa8-ba48-96edf7bf4b0f\") " pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.568331 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gx66r\" (UniqueName: \"kubernetes.io/projected/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-kube-api-access-gx66r\") pod \"redhat-marketplace-9fwcr\" (UID: \"98312b5a-358f-4fa8-ba48-96edf7bf4b0f\") " pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.568757 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-utilities\") pod \"redhat-marketplace-9fwcr\" (UID: \"98312b5a-358f-4fa8-ba48-96edf7bf4b0f\") " pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.568780 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-catalog-content\") pod \"redhat-marketplace-9fwcr\" (UID: \"98312b5a-358f-4fa8-ba48-96edf7bf4b0f\") " pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.599951 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx66r\" (UniqueName: \"kubernetes.io/projected/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-kube-api-access-gx66r\") pod \"redhat-marketplace-9fwcr\" (UID: \"98312b5a-358f-4fa8-ba48-96edf7bf4b0f\") " 
pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:15 crc kubenswrapper[4738]: I0307 07:36:15.663295 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:16 crc kubenswrapper[4738]: I0307 07:36:16.118767 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9fwcr"] Mar 07 07:36:16 crc kubenswrapper[4738]: I0307 07:36:16.470127 4738 generic.go:334] "Generic (PLEG): container finished" podID="8fb07958-7e0d-4002-8a90-40f73d5aaebb" containerID="2fec0cc6fcd30de4fe561e0f5da3a656aa7adf0931e554cb377def83ab2bf456" exitCode=0 Mar 07 07:36:16 crc kubenswrapper[4738]: I0307 07:36:16.470200 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" event={"ID":"8fb07958-7e0d-4002-8a90-40f73d5aaebb","Type":"ContainerDied","Data":"2fec0cc6fcd30de4fe561e0f5da3a656aa7adf0931e554cb377def83ab2bf456"} Mar 07 07:36:16 crc kubenswrapper[4738]: I0307 07:36:16.471737 4738 generic.go:334] "Generic (PLEG): container finished" podID="98312b5a-358f-4fa8-ba48-96edf7bf4b0f" containerID="0e077c5e29c6974bebe4c6c660a062c8c4b07efe353fefb93dc8a809968883c4" exitCode=0 Mar 07 07:36:16 crc kubenswrapper[4738]: I0307 07:36:16.471983 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9fwcr" event={"ID":"98312b5a-358f-4fa8-ba48-96edf7bf4b0f","Type":"ContainerDied","Data":"0e077c5e29c6974bebe4c6c660a062c8c4b07efe353fefb93dc8a809968883c4"} Mar 07 07:36:16 crc kubenswrapper[4738]: I0307 07:36:16.472024 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9fwcr" event={"ID":"98312b5a-358f-4fa8-ba48-96edf7bf4b0f","Type":"ContainerStarted","Data":"4973d7aa128ad891ab73530760803223c44300212d322c3da5bf0377dbb173b9"} Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.485468 4738 generic.go:334] "Generic (PLEG): 
container finished" podID="98312b5a-358f-4fa8-ba48-96edf7bf4b0f" containerID="669f54eb23a2e42fea9d92f574f6be1150e5f888d03ff2864cd068a4b835c31f" exitCode=0 Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.485544 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9fwcr" event={"ID":"98312b5a-358f-4fa8-ba48-96edf7bf4b0f","Type":"ContainerDied","Data":"669f54eb23a2e42fea9d92f574f6be1150e5f888d03ff2864cd068a4b835c31f"} Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.782544 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.821540 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xng5r"] Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.829427 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xng5r"] Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.901345 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb07958-7e0d-4002-8a90-40f73d5aaebb-dispersionconf\") pod \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.902099 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb07958-7e0d-4002-8a90-40f73d5aaebb-etc-swift\") pod \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.902321 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb07958-7e0d-4002-8a90-40f73d5aaebb-scripts\") pod 
\"8fb07958-7e0d-4002-8a90-40f73d5aaebb\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.902623 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z4ht\" (UniqueName: \"kubernetes.io/projected/8fb07958-7e0d-4002-8a90-40f73d5aaebb-kube-api-access-2z4ht\") pod \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.902748 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8fb07958-7e0d-4002-8a90-40f73d5aaebb-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8fb07958-7e0d-4002-8a90-40f73d5aaebb" (UID: "8fb07958-7e0d-4002-8a90-40f73d5aaebb"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.902899 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb07958-7e0d-4002-8a90-40f73d5aaebb-ring-data-devices\") pod \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.903095 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb07958-7e0d-4002-8a90-40f73d5aaebb-swiftconf\") pod \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\" (UID: \"8fb07958-7e0d-4002-8a90-40f73d5aaebb\") " Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.903615 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8fb07958-7e0d-4002-8a90-40f73d5aaebb-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.903679 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/8fb07958-7e0d-4002-8a90-40f73d5aaebb-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8fb07958-7e0d-4002-8a90-40f73d5aaebb" (UID: "8fb07958-7e0d-4002-8a90-40f73d5aaebb"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.909503 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8fb07958-7e0d-4002-8a90-40f73d5aaebb-kube-api-access-2z4ht" (OuterVolumeSpecName: "kube-api-access-2z4ht") pod "8fb07958-7e0d-4002-8a90-40f73d5aaebb" (UID: "8fb07958-7e0d-4002-8a90-40f73d5aaebb"). InnerVolumeSpecName "kube-api-access-2z4ht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.928145 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8fb07958-7e0d-4002-8a90-40f73d5aaebb-scripts" (OuterVolumeSpecName: "scripts") pod "8fb07958-7e0d-4002-8a90-40f73d5aaebb" (UID: "8fb07958-7e0d-4002-8a90-40f73d5aaebb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.928902 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fb07958-7e0d-4002-8a90-40f73d5aaebb-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8fb07958-7e0d-4002-8a90-40f73d5aaebb" (UID: "8fb07958-7e0d-4002-8a90-40f73d5aaebb"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:17 crc kubenswrapper[4738]: I0307 07:36:17.934407 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8fb07958-7e0d-4002-8a90-40f73d5aaebb-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8fb07958-7e0d-4002-8a90-40f73d5aaebb" (UID: "8fb07958-7e0d-4002-8a90-40f73d5aaebb"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.005109 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2z4ht\" (UniqueName: \"kubernetes.io/projected/8fb07958-7e0d-4002-8a90-40f73d5aaebb-kube-api-access-2z4ht\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.005142 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8fb07958-7e0d-4002-8a90-40f73d5aaebb-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.005152 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8fb07958-7e0d-4002-8a90-40f73d5aaebb-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.005181 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8fb07958-7e0d-4002-8a90-40f73d5aaebb-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.005192 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8fb07958-7e0d-4002-8a90-40f73d5aaebb-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.397488 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8fb07958-7e0d-4002-8a90-40f73d5aaebb" path="/var/lib/kubelet/pods/8fb07958-7e0d-4002-8a90-40f73d5aaebb/volumes" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.501892 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9fwcr" event={"ID":"98312b5a-358f-4fa8-ba48-96edf7bf4b0f","Type":"ContainerStarted","Data":"da0e3f25a2d6eee69ee3f71c16f504328d76124b0e3e9dcbd095fe56156695ca"} Mar 07 07:36:18 crc kubenswrapper[4738]: 
I0307 07:36:18.504527 4738 scope.go:117] "RemoveContainer" containerID="2fec0cc6fcd30de4fe561e0f5da3a656aa7adf0931e554cb377def83ab2bf456" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.504687 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xng5r" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.543531 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9fwcr" podStartSLOduration=1.77827693 podStartE2EDuration="3.543509254s" podCreationTimestamp="2026-03-07 07:36:15 +0000 UTC" firstStartedPulling="2026-03-07 07:36:16.473308588 +0000 UTC m=+2194.938295909" lastFinishedPulling="2026-03-07 07:36:18.238540902 +0000 UTC m=+2196.703528233" observedRunningTime="2026-03-07 07:36:18.540810661 +0000 UTC m=+2197.005798022" watchObservedRunningTime="2026-03-07 07:36:18.543509254 +0000 UTC m=+2197.008496585" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.943693 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-z82cg"] Mar 07 07:36:18 crc kubenswrapper[4738]: E0307 07:36:18.944239 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8fb07958-7e0d-4002-8a90-40f73d5aaebb" containerName="swift-ring-rebalance" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.944308 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="8fb07958-7e0d-4002-8a90-40f73d5aaebb" containerName="swift-ring-rebalance" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.944524 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="8fb07958-7e0d-4002-8a90-40f73d5aaebb" containerName="swift-ring-rebalance" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.945083 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.948428 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.949244 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:36:18 crc kubenswrapper[4738]: I0307 07:36:18.961584 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-z82cg"] Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.124441 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22c70b55-623e-4f20-8002-441edace5c46-scripts\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.124601 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/22c70b55-623e-4f20-8002-441edace5c46-ring-data-devices\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.124816 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/22c70b55-623e-4f20-8002-441edace5c46-dispersionconf\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.124959 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9m8g\" (UniqueName: \"kubernetes.io/projected/22c70b55-623e-4f20-8002-441edace5c46-kube-api-access-q9m8g\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.125110 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/22c70b55-623e-4f20-8002-441edace5c46-swiftconf\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.125187 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/22c70b55-623e-4f20-8002-441edace5c46-etc-swift\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.227378 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/22c70b55-623e-4f20-8002-441edace5c46-etc-swift\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.227461 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22c70b55-623e-4f20-8002-441edace5c46-scripts\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: 
I0307 07:36:19.227538 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/22c70b55-623e-4f20-8002-441edace5c46-ring-data-devices\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.227600 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/22c70b55-623e-4f20-8002-441edace5c46-dispersionconf\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.227642 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9m8g\" (UniqueName: \"kubernetes.io/projected/22c70b55-623e-4f20-8002-441edace5c46-kube-api-access-q9m8g\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.227695 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/22c70b55-623e-4f20-8002-441edace5c46-swiftconf\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.228783 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/22c70b55-623e-4f20-8002-441edace5c46-etc-swift\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc 
kubenswrapper[4738]: I0307 07:36:19.228903 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22c70b55-623e-4f20-8002-441edace5c46-scripts\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.229119 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/22c70b55-623e-4f20-8002-441edace5c46-ring-data-devices\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.237826 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/22c70b55-623e-4f20-8002-441edace5c46-dispersionconf\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.238810 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/22c70b55-623e-4f20-8002-441edace5c46-swiftconf\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.249474 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9m8g\" (UniqueName: \"kubernetes.io/projected/22c70b55-623e-4f20-8002-441edace5c46-kube-api-access-q9m8g\") pod \"swift-ring-rebalance-debug-z82cg\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: 
I0307 07:36:19.262236 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.701518 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-djm66"] Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.703571 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.728594 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-djm66"] Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.736072 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0695881c-3f8c-4d8c-b173-58af1764b1a3-utilities\") pod \"redhat-operators-djm66\" (UID: \"0695881c-3f8c-4d8c-b173-58af1764b1a3\") " pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.736134 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wshb4\" (UniqueName: \"kubernetes.io/projected/0695881c-3f8c-4d8c-b173-58af1764b1a3-kube-api-access-wshb4\") pod \"redhat-operators-djm66\" (UID: \"0695881c-3f8c-4d8c-b173-58af1764b1a3\") " pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.736264 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0695881c-3f8c-4d8c-b173-58af1764b1a3-catalog-content\") pod \"redhat-operators-djm66\" (UID: \"0695881c-3f8c-4d8c-b173-58af1764b1a3\") " pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.765587 4738 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-z82cg"] Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.838374 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0695881c-3f8c-4d8c-b173-58af1764b1a3-catalog-content\") pod \"redhat-operators-djm66\" (UID: \"0695881c-3f8c-4d8c-b173-58af1764b1a3\") " pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.838450 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0695881c-3f8c-4d8c-b173-58af1764b1a3-utilities\") pod \"redhat-operators-djm66\" (UID: \"0695881c-3f8c-4d8c-b173-58af1764b1a3\") " pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.838480 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wshb4\" (UniqueName: \"kubernetes.io/projected/0695881c-3f8c-4d8c-b173-58af1764b1a3-kube-api-access-wshb4\") pod \"redhat-operators-djm66\" (UID: \"0695881c-3f8c-4d8c-b173-58af1764b1a3\") " pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.839284 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0695881c-3f8c-4d8c-b173-58af1764b1a3-catalog-content\") pod \"redhat-operators-djm66\" (UID: \"0695881c-3f8c-4d8c-b173-58af1764b1a3\") " pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.840547 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0695881c-3f8c-4d8c-b173-58af1764b1a3-utilities\") pod \"redhat-operators-djm66\" (UID: \"0695881c-3f8c-4d8c-b173-58af1764b1a3\") " 
pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:19 crc kubenswrapper[4738]: I0307 07:36:19.853991 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wshb4\" (UniqueName: \"kubernetes.io/projected/0695881c-3f8c-4d8c-b173-58af1764b1a3-kube-api-access-wshb4\") pod \"redhat-operators-djm66\" (UID: \"0695881c-3f8c-4d8c-b173-58af1764b1a3\") " pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:20 crc kubenswrapper[4738]: I0307 07:36:20.022720 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:20 crc kubenswrapper[4738]: I0307 07:36:20.225726 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-djm66"] Mar 07 07:36:20 crc kubenswrapper[4738]: W0307 07:36:20.226123 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0695881c_3f8c_4d8c_b173_58af1764b1a3.slice/crio-cbeb85f6a2df650d90f9e482123fb21ae76aa0834e1f242475ee2bc1e854715b WatchSource:0}: Error finding container cbeb85f6a2df650d90f9e482123fb21ae76aa0834e1f242475ee2bc1e854715b: Status 404 returned error can't find the container with id cbeb85f6a2df650d90f9e482123fb21ae76aa0834e1f242475ee2bc1e854715b Mar 07 07:36:20 crc kubenswrapper[4738]: I0307 07:36:20.526672 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" event={"ID":"22c70b55-623e-4f20-8002-441edace5c46","Type":"ContainerStarted","Data":"02f982255b9dc309be78b72406d5192924fd1c2aa8dce3e2b2ed17c68eb7e777"} Mar 07 07:36:20 crc kubenswrapper[4738]: I0307 07:36:20.526712 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" event={"ID":"22c70b55-623e-4f20-8002-441edace5c46","Type":"ContainerStarted","Data":"4ef8df08ba04e7c15762de8ade7a606543b6d5cef2ca73f2f67385d1a96dd1e7"} 
Mar 07 07:36:20 crc kubenswrapper[4738]: I0307 07:36:20.528946 4738 generic.go:334] "Generic (PLEG): container finished" podID="0695881c-3f8c-4d8c-b173-58af1764b1a3" containerID="f9cdc0e1830ecd31281e0150d5cee15eb69a17d289ce28c460ee9cdde8a86a66" exitCode=0 Mar 07 07:36:20 crc kubenswrapper[4738]: I0307 07:36:20.528987 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djm66" event={"ID":"0695881c-3f8c-4d8c-b173-58af1764b1a3","Type":"ContainerDied","Data":"f9cdc0e1830ecd31281e0150d5cee15eb69a17d289ce28c460ee9cdde8a86a66"} Mar 07 07:36:20 crc kubenswrapper[4738]: I0307 07:36:20.529011 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djm66" event={"ID":"0695881c-3f8c-4d8c-b173-58af1764b1a3","Type":"ContainerStarted","Data":"cbeb85f6a2df650d90f9e482123fb21ae76aa0834e1f242475ee2bc1e854715b"} Mar 07 07:36:20 crc kubenswrapper[4738]: I0307 07:36:20.562372 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" podStartSLOduration=2.562357149 podStartE2EDuration="2.562357149s" podCreationTimestamp="2026-03-07 07:36:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:36:20.543103047 +0000 UTC m=+2199.008090368" watchObservedRunningTime="2026-03-07 07:36:20.562357149 +0000 UTC m=+2199.027344470" Mar 07 07:36:21 crc kubenswrapper[4738]: I0307 07:36:21.540062 4738 generic.go:334] "Generic (PLEG): container finished" podID="22c70b55-623e-4f20-8002-441edace5c46" containerID="02f982255b9dc309be78b72406d5192924fd1c2aa8dce3e2b2ed17c68eb7e777" exitCode=0 Mar 07 07:36:21 crc kubenswrapper[4738]: I0307 07:36:21.540248 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" 
event={"ID":"22c70b55-623e-4f20-8002-441edace5c46","Type":"ContainerDied","Data":"02f982255b9dc309be78b72406d5192924fd1c2aa8dce3e2b2ed17c68eb7e777"} Mar 07 07:36:21 crc kubenswrapper[4738]: I0307 07:36:21.542635 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djm66" event={"ID":"0695881c-3f8c-4d8c-b173-58af1764b1a3","Type":"ContainerStarted","Data":"964850aaab647ea7f9d425dc1fe5c6f59e9d3b713a050bf1500bf85fa630b273"} Mar 07 07:36:22 crc kubenswrapper[4738]: I0307 07:36:22.815931 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:22 crc kubenswrapper[4738]: I0307 07:36:22.855023 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-z82cg"] Mar 07 07:36:22 crc kubenswrapper[4738]: I0307 07:36:22.860799 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-z82cg"] Mar 07 07:36:22 crc kubenswrapper[4738]: I0307 07:36:22.985687 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9m8g\" (UniqueName: \"kubernetes.io/projected/22c70b55-623e-4f20-8002-441edace5c46-kube-api-access-q9m8g\") pod \"22c70b55-623e-4f20-8002-441edace5c46\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " Mar 07 07:36:22 crc kubenswrapper[4738]: I0307 07:36:22.985761 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/22c70b55-623e-4f20-8002-441edace5c46-dispersionconf\") pod \"22c70b55-623e-4f20-8002-441edace5c46\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " Mar 07 07:36:22 crc kubenswrapper[4738]: I0307 07:36:22.985848 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/22c70b55-623e-4f20-8002-441edace5c46-ring-data-devices\") pod \"22c70b55-623e-4f20-8002-441edace5c46\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " Mar 07 07:36:22 crc kubenswrapper[4738]: I0307 07:36:22.985957 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22c70b55-623e-4f20-8002-441edace5c46-scripts\") pod \"22c70b55-623e-4f20-8002-441edace5c46\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " Mar 07 07:36:22 crc kubenswrapper[4738]: I0307 07:36:22.985985 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/22c70b55-623e-4f20-8002-441edace5c46-etc-swift\") pod \"22c70b55-623e-4f20-8002-441edace5c46\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " Mar 07 07:36:22 crc kubenswrapper[4738]: I0307 07:36:22.986032 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/22c70b55-623e-4f20-8002-441edace5c46-swiftconf\") pod \"22c70b55-623e-4f20-8002-441edace5c46\" (UID: \"22c70b55-623e-4f20-8002-441edace5c46\") " Mar 07 07:36:22 crc kubenswrapper[4738]: I0307 07:36:22.986394 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c70b55-623e-4f20-8002-441edace5c46-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "22c70b55-623e-4f20-8002-441edace5c46" (UID: "22c70b55-623e-4f20-8002-441edace5c46"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:22 crc kubenswrapper[4738]: I0307 07:36:22.986889 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22c70b55-623e-4f20-8002-441edace5c46-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "22c70b55-623e-4f20-8002-441edace5c46" (UID: "22c70b55-623e-4f20-8002-441edace5c46"). 
InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:22 crc kubenswrapper[4738]: I0307 07:36:22.994518 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c70b55-623e-4f20-8002-441edace5c46-kube-api-access-q9m8g" (OuterVolumeSpecName: "kube-api-access-q9m8g") pod "22c70b55-623e-4f20-8002-441edace5c46" (UID: "22c70b55-623e-4f20-8002-441edace5c46"). InnerVolumeSpecName "kube-api-access-q9m8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:23 crc kubenswrapper[4738]: I0307 07:36:23.007394 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c70b55-623e-4f20-8002-441edace5c46-scripts" (OuterVolumeSpecName: "scripts") pod "22c70b55-623e-4f20-8002-441edace5c46" (UID: "22c70b55-623e-4f20-8002-441edace5c46"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:23 crc kubenswrapper[4738]: I0307 07:36:23.011664 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c70b55-623e-4f20-8002-441edace5c46-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "22c70b55-623e-4f20-8002-441edace5c46" (UID: "22c70b55-623e-4f20-8002-441edace5c46"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:23 crc kubenswrapper[4738]: I0307 07:36:23.022899 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c70b55-623e-4f20-8002-441edace5c46-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "22c70b55-623e-4f20-8002-441edace5c46" (UID: "22c70b55-623e-4f20-8002-441edace5c46"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:23 crc kubenswrapper[4738]: I0307 07:36:23.088387 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/22c70b55-623e-4f20-8002-441edace5c46-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:23 crc kubenswrapper[4738]: I0307 07:36:23.088443 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22c70b55-623e-4f20-8002-441edace5c46-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:23 crc kubenswrapper[4738]: I0307 07:36:23.088506 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/22c70b55-623e-4f20-8002-441edace5c46-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:23 crc kubenswrapper[4738]: I0307 07:36:23.088530 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/22c70b55-623e-4f20-8002-441edace5c46-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:23 crc kubenswrapper[4738]: I0307 07:36:23.088589 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9m8g\" (UniqueName: \"kubernetes.io/projected/22c70b55-623e-4f20-8002-441edace5c46-kube-api-access-q9m8g\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:23 crc kubenswrapper[4738]: I0307 07:36:23.088609 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/22c70b55-623e-4f20-8002-441edace5c46-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:23 crc kubenswrapper[4738]: I0307 07:36:23.557124 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ef8df08ba04e7c15762de8ade7a606543b6d5cef2ca73f2f67385d1a96dd1e7" Mar 07 07:36:23 crc kubenswrapper[4738]: I0307 07:36:23.557177 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-z82cg" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.066595 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb"] Mar 07 07:36:24 crc kubenswrapper[4738]: E0307 07:36:24.067296 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22c70b55-623e-4f20-8002-441edace5c46" containerName="swift-ring-rebalance" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.067313 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="22c70b55-623e-4f20-8002-441edace5c46" containerName="swift-ring-rebalance" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.067506 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="22c70b55-623e-4f20-8002-441edace5c46" containerName="swift-ring-rebalance" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.068122 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.070280 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.071869 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.087022 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb"] Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.206497 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0b9a04b9-386f-4b40-8956-5ad00543964b-etc-swift\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.206599 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b9a04b9-386f-4b40-8956-5ad00543964b-scripts\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.206638 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0b9a04b9-386f-4b40-8956-5ad00543964b-dispersionconf\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.206672 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0b9a04b9-386f-4b40-8956-5ad00543964b-swiftconf\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.206981 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwfzv\" (UniqueName: \"kubernetes.io/projected/0b9a04b9-386f-4b40-8956-5ad00543964b-kube-api-access-fwfzv\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.207084 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/0b9a04b9-386f-4b40-8956-5ad00543964b-ring-data-devices\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.307994 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwfzv\" (UniqueName: \"kubernetes.io/projected/0b9a04b9-386f-4b40-8956-5ad00543964b-kube-api-access-fwfzv\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.308043 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0b9a04b9-386f-4b40-8956-5ad00543964b-ring-data-devices\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.308080 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0b9a04b9-386f-4b40-8956-5ad00543964b-etc-swift\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.308158 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b9a04b9-386f-4b40-8956-5ad00543964b-scripts\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.308208 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0b9a04b9-386f-4b40-8956-5ad00543964b-dispersionconf\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.308246 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0b9a04b9-386f-4b40-8956-5ad00543964b-swiftconf\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.308737 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0b9a04b9-386f-4b40-8956-5ad00543964b-etc-swift\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.308965 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b9a04b9-386f-4b40-8956-5ad00543964b-scripts\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.308968 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0b9a04b9-386f-4b40-8956-5ad00543964b-ring-data-devices\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.311922 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/0b9a04b9-386f-4b40-8956-5ad00543964b-swiftconf\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.313704 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0b9a04b9-386f-4b40-8956-5ad00543964b-dispersionconf\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.324141 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwfzv\" (UniqueName: \"kubernetes.io/projected/0b9a04b9-386f-4b40-8956-5ad00543964b-kube-api-access-fwfzv\") pod \"swift-ring-rebalance-debug-gzzmb\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.394858 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c70b55-623e-4f20-8002-441edace5c46" path="/var/lib/kubelet/pods/22c70b55-623e-4f20-8002-441edace5c46/volumes" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.404447 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.566897 4738 generic.go:334] "Generic (PLEG): container finished" podID="0695881c-3f8c-4d8c-b173-58af1764b1a3" containerID="964850aaab647ea7f9d425dc1fe5c6f59e9d3b713a050bf1500bf85fa630b273" exitCode=0 Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.566951 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djm66" event={"ID":"0695881c-3f8c-4d8c-b173-58af1764b1a3","Type":"ContainerDied","Data":"964850aaab647ea7f9d425dc1fe5c6f59e9d3b713a050bf1500bf85fa630b273"} Mar 07 07:36:24 crc kubenswrapper[4738]: I0307 07:36:24.817026 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb"] Mar 07 07:36:25 crc kubenswrapper[4738]: I0307 07:36:25.577415 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" event={"ID":"0b9a04b9-386f-4b40-8956-5ad00543964b","Type":"ContainerStarted","Data":"cd95cba130211bae3383b5c0d3637bfa0f1c7dd50d66156770ba4c87008ab070"} Mar 07 07:36:25 crc kubenswrapper[4738]: I0307 07:36:25.577780 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" event={"ID":"0b9a04b9-386f-4b40-8956-5ad00543964b","Type":"ContainerStarted","Data":"990659d65a18689965029e73fe2898581ad06c7fa00f1173cd4ddf5fefeff5f2"} Mar 07 07:36:25 crc kubenswrapper[4738]: I0307 07:36:25.580564 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djm66" event={"ID":"0695881c-3f8c-4d8c-b173-58af1764b1a3","Type":"ContainerStarted","Data":"52d4b82a4459293b20285d25c24ac424cd3c48bbea649c4023fc985ea7760f6b"} Mar 07 07:36:25 crc kubenswrapper[4738]: I0307 07:36:25.598771 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" 
podStartSLOduration=1.598754321 podStartE2EDuration="1.598754321s" podCreationTimestamp="2026-03-07 07:36:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:36:25.596981025 +0000 UTC m=+2204.061968386" watchObservedRunningTime="2026-03-07 07:36:25.598754321 +0000 UTC m=+2204.063741642" Mar 07 07:36:25 crc kubenswrapper[4738]: I0307 07:36:25.625704 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-djm66" podStartSLOduration=2.083430866 podStartE2EDuration="6.62568794s" podCreationTimestamp="2026-03-07 07:36:19 +0000 UTC" firstStartedPulling="2026-03-07 07:36:20.530402622 +0000 UTC m=+2198.995389943" lastFinishedPulling="2026-03-07 07:36:25.072659696 +0000 UTC m=+2203.537647017" observedRunningTime="2026-03-07 07:36:25.618677052 +0000 UTC m=+2204.083664383" watchObservedRunningTime="2026-03-07 07:36:25.62568794 +0000 UTC m=+2204.090675261" Mar 07 07:36:25 crc kubenswrapper[4738]: I0307 07:36:25.664453 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:25 crc kubenswrapper[4738]: I0307 07:36:25.664500 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:25 crc kubenswrapper[4738]: I0307 07:36:25.710329 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:26 crc kubenswrapper[4738]: I0307 07:36:26.589695 4738 generic.go:334] "Generic (PLEG): container finished" podID="0b9a04b9-386f-4b40-8956-5ad00543964b" containerID="cd95cba130211bae3383b5c0d3637bfa0f1c7dd50d66156770ba4c87008ab070" exitCode=0 Mar 07 07:36:26 crc kubenswrapper[4738]: I0307 07:36:26.589830 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" event={"ID":"0b9a04b9-386f-4b40-8956-5ad00543964b","Type":"ContainerDied","Data":"cd95cba130211bae3383b5c0d3637bfa0f1c7dd50d66156770ba4c87008ab070"} Mar 07 07:36:26 crc kubenswrapper[4738]: I0307 07:36:26.635578 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:27 crc kubenswrapper[4738]: I0307 07:36:27.939242 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:27 crc kubenswrapper[4738]: I0307 07:36:27.973494 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb"] Mar 07 07:36:27 crc kubenswrapper[4738]: I0307 07:36:27.980453 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb"] Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.099433 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b9a04b9-386f-4b40-8956-5ad00543964b-scripts\") pod \"0b9a04b9-386f-4b40-8956-5ad00543964b\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.099484 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0b9a04b9-386f-4b40-8956-5ad00543964b-etc-swift\") pod \"0b9a04b9-386f-4b40-8956-5ad00543964b\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.099516 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0b9a04b9-386f-4b40-8956-5ad00543964b-dispersionconf\") pod \"0b9a04b9-386f-4b40-8956-5ad00543964b\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " Mar 07 
07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.099547 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0b9a04b9-386f-4b40-8956-5ad00543964b-ring-data-devices\") pod \"0b9a04b9-386f-4b40-8956-5ad00543964b\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.099659 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwfzv\" (UniqueName: \"kubernetes.io/projected/0b9a04b9-386f-4b40-8956-5ad00543964b-kube-api-access-fwfzv\") pod \"0b9a04b9-386f-4b40-8956-5ad00543964b\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.099682 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0b9a04b9-386f-4b40-8956-5ad00543964b-swiftconf\") pod \"0b9a04b9-386f-4b40-8956-5ad00543964b\" (UID: \"0b9a04b9-386f-4b40-8956-5ad00543964b\") " Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.100359 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b9a04b9-386f-4b40-8956-5ad00543964b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0b9a04b9-386f-4b40-8956-5ad00543964b" (UID: "0b9a04b9-386f-4b40-8956-5ad00543964b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.100835 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0b9a04b9-386f-4b40-8956-5ad00543964b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0b9a04b9-386f-4b40-8956-5ad00543964b" (UID: "0b9a04b9-386f-4b40-8956-5ad00543964b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.104615 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b9a04b9-386f-4b40-8956-5ad00543964b-kube-api-access-fwfzv" (OuterVolumeSpecName: "kube-api-access-fwfzv") pod "0b9a04b9-386f-4b40-8956-5ad00543964b" (UID: "0b9a04b9-386f-4b40-8956-5ad00543964b"). InnerVolumeSpecName "kube-api-access-fwfzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.119912 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b9a04b9-386f-4b40-8956-5ad00543964b-scripts" (OuterVolumeSpecName: "scripts") pod "0b9a04b9-386f-4b40-8956-5ad00543964b" (UID: "0b9a04b9-386f-4b40-8956-5ad00543964b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.121933 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b9a04b9-386f-4b40-8956-5ad00543964b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0b9a04b9-386f-4b40-8956-5ad00543964b" (UID: "0b9a04b9-386f-4b40-8956-5ad00543964b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.131366 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b9a04b9-386f-4b40-8956-5ad00543964b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0b9a04b9-386f-4b40-8956-5ad00543964b" (UID: "0b9a04b9-386f-4b40-8956-5ad00543964b"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.200770 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0b9a04b9-386f-4b40-8956-5ad00543964b-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.200812 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0b9a04b9-386f-4b40-8956-5ad00543964b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.200824 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0b9a04b9-386f-4b40-8956-5ad00543964b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.200833 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0b9a04b9-386f-4b40-8956-5ad00543964b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.200844 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwfzv\" (UniqueName: \"kubernetes.io/projected/0b9a04b9-386f-4b40-8956-5ad00543964b-kube-api-access-fwfzv\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.200853 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0b9a04b9-386f-4b40-8956-5ad00543964b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.394276 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b9a04b9-386f-4b40-8956-5ad00543964b" path="/var/lib/kubelet/pods/0b9a04b9-386f-4b40-8956-5ad00543964b/volumes" Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.485251 4738 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/redhat-marketplace-9fwcr"] Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.607629 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-gzzmb" Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.607805 4738 scope.go:117] "RemoveContainer" containerID="cd95cba130211bae3383b5c0d3637bfa0f1c7dd50d66156770ba4c87008ab070" Mar 07 07:36:28 crc kubenswrapper[4738]: I0307 07:36:28.607968 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9fwcr" podUID="98312b5a-358f-4fa8-ba48-96edf7bf4b0f" containerName="registry-server" containerID="cri-o://da0e3f25a2d6eee69ee3f71c16f504328d76124b0e3e9dcbd095fe56156695ca" gracePeriod=2 Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.074626 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.117408 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-utilities\") pod \"98312b5a-358f-4fa8-ba48-96edf7bf4b0f\" (UID: \"98312b5a-358f-4fa8-ba48-96edf7bf4b0f\") " Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.117460 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-catalog-content\") pod \"98312b5a-358f-4fa8-ba48-96edf7bf4b0f\" (UID: \"98312b5a-358f-4fa8-ba48-96edf7bf4b0f\") " Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.117490 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gx66r\" (UniqueName: \"kubernetes.io/projected/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-kube-api-access-gx66r\") pod 
\"98312b5a-358f-4fa8-ba48-96edf7bf4b0f\" (UID: \"98312b5a-358f-4fa8-ba48-96edf7bf4b0f\") " Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.118307 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-utilities" (OuterVolumeSpecName: "utilities") pod "98312b5a-358f-4fa8-ba48-96edf7bf4b0f" (UID: "98312b5a-358f-4fa8-ba48-96edf7bf4b0f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.124404 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-kube-api-access-gx66r" (OuterVolumeSpecName: "kube-api-access-gx66r") pod "98312b5a-358f-4fa8-ba48-96edf7bf4b0f" (UID: "98312b5a-358f-4fa8-ba48-96edf7bf4b0f"). InnerVolumeSpecName "kube-api-access-gx66r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.151999 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p"] Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.152228 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "98312b5a-358f-4fa8-ba48-96edf7bf4b0f" (UID: "98312b5a-358f-4fa8-ba48-96edf7bf4b0f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:29 crc kubenswrapper[4738]: E0307 07:36:29.152367 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98312b5a-358f-4fa8-ba48-96edf7bf4b0f" containerName="registry-server" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.152387 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="98312b5a-358f-4fa8-ba48-96edf7bf4b0f" containerName="registry-server" Mar 07 07:36:29 crc kubenswrapper[4738]: E0307 07:36:29.152416 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9a04b9-386f-4b40-8956-5ad00543964b" containerName="swift-ring-rebalance" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.152424 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9a04b9-386f-4b40-8956-5ad00543964b" containerName="swift-ring-rebalance" Mar 07 07:36:29 crc kubenswrapper[4738]: E0307 07:36:29.152441 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98312b5a-358f-4fa8-ba48-96edf7bf4b0f" containerName="extract-utilities" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.152449 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="98312b5a-358f-4fa8-ba48-96edf7bf4b0f" containerName="extract-utilities" Mar 07 07:36:29 crc kubenswrapper[4738]: E0307 07:36:29.152462 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98312b5a-358f-4fa8-ba48-96edf7bf4b0f" containerName="extract-content" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.152470 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="98312b5a-358f-4fa8-ba48-96edf7bf4b0f" containerName="extract-content" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.152629 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="98312b5a-358f-4fa8-ba48-96edf7bf4b0f" containerName="registry-server" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.152652 4738 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0b9a04b9-386f-4b40-8956-5ad00543964b" containerName="swift-ring-rebalance" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.153203 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.156248 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.156916 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.176245 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p"] Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.218505 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-dispersionconf\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.218557 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-scripts\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.218581 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-swiftconf\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.218625 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-etc-swift\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.218655 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vngt\" (UniqueName: \"kubernetes.io/projected/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-kube-api-access-7vngt\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.218839 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-ring-data-devices\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.219146 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.219201 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.219218 4738 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-gx66r\" (UniqueName: \"kubernetes.io/projected/98312b5a-358f-4fa8-ba48-96edf7bf4b0f-kube-api-access-gx66r\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.320555 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-dispersionconf\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.320608 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-scripts\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.320637 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-swiftconf\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.320658 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-etc-swift\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.320689 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vngt\" (UniqueName: 
\"kubernetes.io/projected/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-kube-api-access-7vngt\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.320725 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-ring-data-devices\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.321357 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-etc-swift\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.321601 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-ring-data-devices\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.321610 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-scripts\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.323787 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-dispersionconf\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.333596 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-swiftconf\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.347805 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vngt\" (UniqueName: \"kubernetes.io/projected/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-kube-api-access-7vngt\") pod \"swift-ring-rebalance-debug-m7q6p\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.478066 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.636749 4738 generic.go:334] "Generic (PLEG): container finished" podID="98312b5a-358f-4fa8-ba48-96edf7bf4b0f" containerID="da0e3f25a2d6eee69ee3f71c16f504328d76124b0e3e9dcbd095fe56156695ca" exitCode=0 Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.636796 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9fwcr" event={"ID":"98312b5a-358f-4fa8-ba48-96edf7bf4b0f","Type":"ContainerDied","Data":"da0e3f25a2d6eee69ee3f71c16f504328d76124b0e3e9dcbd095fe56156695ca"} Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.636825 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9fwcr" event={"ID":"98312b5a-358f-4fa8-ba48-96edf7bf4b0f","Type":"ContainerDied","Data":"4973d7aa128ad891ab73530760803223c44300212d322c3da5bf0377dbb173b9"} Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.636829 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9fwcr" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.636844 4738 scope.go:117] "RemoveContainer" containerID="da0e3f25a2d6eee69ee3f71c16f504328d76124b0e3e9dcbd095fe56156695ca" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.667127 4738 scope.go:117] "RemoveContainer" containerID="669f54eb23a2e42fea9d92f574f6be1150e5f888d03ff2864cd068a4b835c31f" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.677441 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9fwcr"] Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.682215 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9fwcr"] Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.692085 4738 scope.go:117] "RemoveContainer" containerID="0e077c5e29c6974bebe4c6c660a062c8c4b07efe353fefb93dc8a809968883c4" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.707959 4738 scope.go:117] "RemoveContainer" containerID="da0e3f25a2d6eee69ee3f71c16f504328d76124b0e3e9dcbd095fe56156695ca" Mar 07 07:36:29 crc kubenswrapper[4738]: E0307 07:36:29.708518 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da0e3f25a2d6eee69ee3f71c16f504328d76124b0e3e9dcbd095fe56156695ca\": container with ID starting with da0e3f25a2d6eee69ee3f71c16f504328d76124b0e3e9dcbd095fe56156695ca not found: ID does not exist" containerID="da0e3f25a2d6eee69ee3f71c16f504328d76124b0e3e9dcbd095fe56156695ca" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.708565 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da0e3f25a2d6eee69ee3f71c16f504328d76124b0e3e9dcbd095fe56156695ca"} err="failed to get container status \"da0e3f25a2d6eee69ee3f71c16f504328d76124b0e3e9dcbd095fe56156695ca\": rpc error: code = NotFound desc = could not find container 
\"da0e3f25a2d6eee69ee3f71c16f504328d76124b0e3e9dcbd095fe56156695ca\": container with ID starting with da0e3f25a2d6eee69ee3f71c16f504328d76124b0e3e9dcbd095fe56156695ca not found: ID does not exist" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.708594 4738 scope.go:117] "RemoveContainer" containerID="669f54eb23a2e42fea9d92f574f6be1150e5f888d03ff2864cd068a4b835c31f" Mar 07 07:36:29 crc kubenswrapper[4738]: E0307 07:36:29.708879 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"669f54eb23a2e42fea9d92f574f6be1150e5f888d03ff2864cd068a4b835c31f\": container with ID starting with 669f54eb23a2e42fea9d92f574f6be1150e5f888d03ff2864cd068a4b835c31f not found: ID does not exist" containerID="669f54eb23a2e42fea9d92f574f6be1150e5f888d03ff2864cd068a4b835c31f" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.708914 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"669f54eb23a2e42fea9d92f574f6be1150e5f888d03ff2864cd068a4b835c31f"} err="failed to get container status \"669f54eb23a2e42fea9d92f574f6be1150e5f888d03ff2864cd068a4b835c31f\": rpc error: code = NotFound desc = could not find container \"669f54eb23a2e42fea9d92f574f6be1150e5f888d03ff2864cd068a4b835c31f\": container with ID starting with 669f54eb23a2e42fea9d92f574f6be1150e5f888d03ff2864cd068a4b835c31f not found: ID does not exist" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.708989 4738 scope.go:117] "RemoveContainer" containerID="0e077c5e29c6974bebe4c6c660a062c8c4b07efe353fefb93dc8a809968883c4" Mar 07 07:36:29 crc kubenswrapper[4738]: E0307 07:36:29.709905 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e077c5e29c6974bebe4c6c660a062c8c4b07efe353fefb93dc8a809968883c4\": container with ID starting with 0e077c5e29c6974bebe4c6c660a062c8c4b07efe353fefb93dc8a809968883c4 not found: ID does not exist" 
containerID="0e077c5e29c6974bebe4c6c660a062c8c4b07efe353fefb93dc8a809968883c4" Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.709934 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e077c5e29c6974bebe4c6c660a062c8c4b07efe353fefb93dc8a809968883c4"} err="failed to get container status \"0e077c5e29c6974bebe4c6c660a062c8c4b07efe353fefb93dc8a809968883c4\": rpc error: code = NotFound desc = could not find container \"0e077c5e29c6974bebe4c6c660a062c8c4b07efe353fefb93dc8a809968883c4\": container with ID starting with 0e077c5e29c6974bebe4c6c660a062c8c4b07efe353fefb93dc8a809968883c4 not found: ID does not exist" Mar 07 07:36:29 crc kubenswrapper[4738]: W0307 07:36:29.907417 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b3a017e_cd4c_439a_b16a_89d7d8fa8d05.slice/crio-dfa817b7a1525827ae6e219093839b0a1955f5ebb261f4c811cb65ba03ea0e9e WatchSource:0}: Error finding container dfa817b7a1525827ae6e219093839b0a1955f5ebb261f4c811cb65ba03ea0e9e: Status 404 returned error can't find the container with id dfa817b7a1525827ae6e219093839b0a1955f5ebb261f4c811cb65ba03ea0e9e Mar 07 07:36:29 crc kubenswrapper[4738]: I0307 07:36:29.912053 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p"] Mar 07 07:36:30 crc kubenswrapper[4738]: I0307 07:36:30.023322 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:30 crc kubenswrapper[4738]: I0307 07:36:30.023370 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:30 crc kubenswrapper[4738]: I0307 07:36:30.394607 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98312b5a-358f-4fa8-ba48-96edf7bf4b0f" 
path="/var/lib/kubelet/pods/98312b5a-358f-4fa8-ba48-96edf7bf4b0f/volumes" Mar 07 07:36:30 crc kubenswrapper[4738]: I0307 07:36:30.648130 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" event={"ID":"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05","Type":"ContainerStarted","Data":"91c1fdf9fdf78e2c489b8b769f3bf378e60fb6a872f67d43d9fc4f7405cc9756"} Mar 07 07:36:30 crc kubenswrapper[4738]: I0307 07:36:30.648569 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" event={"ID":"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05","Type":"ContainerStarted","Data":"dfa817b7a1525827ae6e219093839b0a1955f5ebb261f4c811cb65ba03ea0e9e"} Mar 07 07:36:30 crc kubenswrapper[4738]: I0307 07:36:30.667535 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" podStartSLOduration=1.667516731 podStartE2EDuration="1.667516731s" podCreationTimestamp="2026-03-07 07:36:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:36:30.661971423 +0000 UTC m=+2209.126958744" watchObservedRunningTime="2026-03-07 07:36:30.667516731 +0000 UTC m=+2209.132504042" Mar 07 07:36:31 crc kubenswrapper[4738]: I0307 07:36:31.079549 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-djm66" podUID="0695881c-3f8c-4d8c-b173-58af1764b1a3" containerName="registry-server" probeResult="failure" output=< Mar 07 07:36:31 crc kubenswrapper[4738]: timeout: failed to connect service ":50051" within 1s Mar 07 07:36:31 crc kubenswrapper[4738]: > Mar 07 07:36:31 crc kubenswrapper[4738]: I0307 07:36:31.658188 4738 generic.go:334] "Generic (PLEG): container finished" podID="7b3a017e-cd4c-439a-b16a-89d7d8fa8d05" containerID="91c1fdf9fdf78e2c489b8b769f3bf378e60fb6a872f67d43d9fc4f7405cc9756" exitCode=0 Mar 07 07:36:31 
crc kubenswrapper[4738]: I0307 07:36:31.658251 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" event={"ID":"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05","Type":"ContainerDied","Data":"91c1fdf9fdf78e2c489b8b769f3bf378e60fb6a872f67d43d9fc4f7405cc9756"} Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.057431 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.072782 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-scripts\") pod \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.072869 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vngt\" (UniqueName: \"kubernetes.io/projected/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-kube-api-access-7vngt\") pod \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.072913 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-swiftconf\") pod \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.072999 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-ring-data-devices\") pod \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.073029 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-etc-swift\") pod \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.073068 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-dispersionconf\") pod \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\" (UID: \"7b3a017e-cd4c-439a-b16a-89d7d8fa8d05\") " Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.073799 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7b3a017e-cd4c-439a-b16a-89d7d8fa8d05" (UID: "7b3a017e-cd4c-439a-b16a-89d7d8fa8d05"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.074487 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7b3a017e-cd4c-439a-b16a-89d7d8fa8d05" (UID: "7b3a017e-cd4c-439a-b16a-89d7d8fa8d05"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.078007 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-kube-api-access-7vngt" (OuterVolumeSpecName: "kube-api-access-7vngt") pod "7b3a017e-cd4c-439a-b16a-89d7d8fa8d05" (UID: "7b3a017e-cd4c-439a-b16a-89d7d8fa8d05"). InnerVolumeSpecName "kube-api-access-7vngt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.094023 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p"] Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.100013 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-scripts" (OuterVolumeSpecName: "scripts") pod "7b3a017e-cd4c-439a-b16a-89d7d8fa8d05" (UID: "7b3a017e-cd4c-439a-b16a-89d7d8fa8d05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.100141 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p"] Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.103293 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7b3a017e-cd4c-439a-b16a-89d7d8fa8d05" (UID: "7b3a017e-cd4c-439a-b16a-89d7d8fa8d05"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.109337 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7b3a017e-cd4c-439a-b16a-89d7d8fa8d05" (UID: "7b3a017e-cd4c-439a-b16a-89d7d8fa8d05"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.173904 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.173945 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vngt\" (UniqueName: \"kubernetes.io/projected/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-kube-api-access-7vngt\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.173957 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.173967 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.173978 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.173986 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.685356 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfa817b7a1525827ae6e219093839b0a1955f5ebb261f4c811cb65ba03ea0e9e" Mar 07 07:36:33 crc kubenswrapper[4738]: I0307 07:36:33.685430 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-m7q6p" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.260371 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hjssh"] Mar 07 07:36:34 crc kubenswrapper[4738]: E0307 07:36:34.260902 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b3a017e-cd4c-439a-b16a-89d7d8fa8d05" containerName="swift-ring-rebalance" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.260914 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b3a017e-cd4c-439a-b16a-89d7d8fa8d05" containerName="swift-ring-rebalance" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.261060 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b3a017e-cd4c-439a-b16a-89d7d8fa8d05" containerName="swift-ring-rebalance" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.261573 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.263217 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.263656 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.268560 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hjssh"] Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.391816 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjtq7\" (UniqueName: \"kubernetes.io/projected/57db2fe4-bfc6-499b-a56f-a09824d40ad5-kube-api-access-mjtq7\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.391885 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/57db2fe4-bfc6-499b-a56f-a09824d40ad5-dispersionconf\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.391923 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/57db2fe4-bfc6-499b-a56f-a09824d40ad5-swiftconf\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.392093 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/57db2fe4-bfc6-499b-a56f-a09824d40ad5-ring-data-devices\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.392135 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57db2fe4-bfc6-499b-a56f-a09824d40ad5-scripts\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.392183 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/57db2fe4-bfc6-499b-a56f-a09824d40ad5-etc-swift\") pod 
\"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.393763 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b3a017e-cd4c-439a-b16a-89d7d8fa8d05" path="/var/lib/kubelet/pods/7b3a017e-cd4c-439a-b16a-89d7d8fa8d05/volumes" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.493872 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/57db2fe4-bfc6-499b-a56f-a09824d40ad5-dispersionconf\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.493941 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/57db2fe4-bfc6-499b-a56f-a09824d40ad5-swiftconf\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.493996 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/57db2fe4-bfc6-499b-a56f-a09824d40ad5-ring-data-devices\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.494047 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57db2fe4-bfc6-499b-a56f-a09824d40ad5-scripts\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 
07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.494073 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/57db2fe4-bfc6-499b-a56f-a09824d40ad5-etc-swift\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.494114 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjtq7\" (UniqueName: \"kubernetes.io/projected/57db2fe4-bfc6-499b-a56f-a09824d40ad5-kube-api-access-mjtq7\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.494550 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/57db2fe4-bfc6-499b-a56f-a09824d40ad5-etc-swift\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.494918 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/57db2fe4-bfc6-499b-a56f-a09824d40ad5-ring-data-devices\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.494962 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57db2fe4-bfc6-499b-a56f-a09824d40ad5-scripts\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc 
kubenswrapper[4738]: I0307 07:36:34.498829 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/57db2fe4-bfc6-499b-a56f-a09824d40ad5-dispersionconf\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.499660 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/57db2fe4-bfc6-499b-a56f-a09824d40ad5-swiftconf\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.511350 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjtq7\" (UniqueName: \"kubernetes.io/projected/57db2fe4-bfc6-499b-a56f-a09824d40ad5-kube-api-access-mjtq7\") pod \"swift-ring-rebalance-debug-hjssh\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:34 crc kubenswrapper[4738]: I0307 07:36:34.585034 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:35 crc kubenswrapper[4738]: I0307 07:36:35.080333 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hjssh"] Mar 07 07:36:35 crc kubenswrapper[4738]: W0307 07:36:35.084451 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod57db2fe4_bfc6_499b_a56f_a09824d40ad5.slice/crio-df878b531e0c084015f6033898747380e3f66b5b3b6ed06c87ec9f9e2953e888 WatchSource:0}: Error finding container df878b531e0c084015f6033898747380e3f66b5b3b6ed06c87ec9f9e2953e888: Status 404 returned error can't find the container with id df878b531e0c084015f6033898747380e3f66b5b3b6ed06c87ec9f9e2953e888 Mar 07 07:36:35 crc kubenswrapper[4738]: I0307 07:36:35.760274 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" event={"ID":"57db2fe4-bfc6-499b-a56f-a09824d40ad5","Type":"ContainerStarted","Data":"8d03b7af0da61c8a2889c272c8a0685c54a5a34ff9d95e348146d587fd0a4a54"} Mar 07 07:36:35 crc kubenswrapper[4738]: I0307 07:36:35.760785 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" event={"ID":"57db2fe4-bfc6-499b-a56f-a09824d40ad5","Type":"ContainerStarted","Data":"df878b531e0c084015f6033898747380e3f66b5b3b6ed06c87ec9f9e2953e888"} Mar 07 07:36:35 crc kubenswrapper[4738]: I0307 07:36:35.789306 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" podStartSLOduration=1.789281374 podStartE2EDuration="1.789281374s" podCreationTimestamp="2026-03-07 07:36:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:36:35.77941719 +0000 UTC m=+2214.244404521" watchObservedRunningTime="2026-03-07 
07:36:35.789281374 +0000 UTC m=+2214.254268705" Mar 07 07:36:36 crc kubenswrapper[4738]: I0307 07:36:36.773359 4738 generic.go:334] "Generic (PLEG): container finished" podID="57db2fe4-bfc6-499b-a56f-a09824d40ad5" containerID="8d03b7af0da61c8a2889c272c8a0685c54a5a34ff9d95e348146d587fd0a4a54" exitCode=0 Mar 07 07:36:36 crc kubenswrapper[4738]: I0307 07:36:36.773432 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" event={"ID":"57db2fe4-bfc6-499b-a56f-a09824d40ad5","Type":"ContainerDied","Data":"8d03b7af0da61c8a2889c272c8a0685c54a5a34ff9d95e348146d587fd0a4a54"} Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.034142 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.062838 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hjssh"] Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.068199 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hjssh"] Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.150398 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/57db2fe4-bfc6-499b-a56f-a09824d40ad5-etc-swift\") pod \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.150451 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/57db2fe4-bfc6-499b-a56f-a09824d40ad5-dispersionconf\") pod \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.150741 4738 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/57db2fe4-bfc6-499b-a56f-a09824d40ad5-swiftconf\") pod \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.150788 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjtq7\" (UniqueName: \"kubernetes.io/projected/57db2fe4-bfc6-499b-a56f-a09824d40ad5-kube-api-access-mjtq7\") pod \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.150872 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/57db2fe4-bfc6-499b-a56f-a09824d40ad5-ring-data-devices\") pod \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.150898 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57db2fe4-bfc6-499b-a56f-a09824d40ad5-scripts\") pod \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\" (UID: \"57db2fe4-bfc6-499b-a56f-a09824d40ad5\") " Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.152227 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57db2fe4-bfc6-499b-a56f-a09824d40ad5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "57db2fe4-bfc6-499b-a56f-a09824d40ad5" (UID: "57db2fe4-bfc6-499b-a56f-a09824d40ad5"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.152463 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57db2fe4-bfc6-499b-a56f-a09824d40ad5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "57db2fe4-bfc6-499b-a56f-a09824d40ad5" (UID: "57db2fe4-bfc6-499b-a56f-a09824d40ad5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.156495 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57db2fe4-bfc6-499b-a56f-a09824d40ad5-kube-api-access-mjtq7" (OuterVolumeSpecName: "kube-api-access-mjtq7") pod "57db2fe4-bfc6-499b-a56f-a09824d40ad5" (UID: "57db2fe4-bfc6-499b-a56f-a09824d40ad5"). InnerVolumeSpecName "kube-api-access-mjtq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.171967 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57db2fe4-bfc6-499b-a56f-a09824d40ad5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "57db2fe4-bfc6-499b-a56f-a09824d40ad5" (UID: "57db2fe4-bfc6-499b-a56f-a09824d40ad5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.172462 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57db2fe4-bfc6-499b-a56f-a09824d40ad5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "57db2fe4-bfc6-499b-a56f-a09824d40ad5" (UID: "57db2fe4-bfc6-499b-a56f-a09824d40ad5"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.174626 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57db2fe4-bfc6-499b-a56f-a09824d40ad5-scripts" (OuterVolumeSpecName: "scripts") pod "57db2fe4-bfc6-499b-a56f-a09824d40ad5" (UID: "57db2fe4-bfc6-499b-a56f-a09824d40ad5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.252944 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjtq7\" (UniqueName: \"kubernetes.io/projected/57db2fe4-bfc6-499b-a56f-a09824d40ad5-kube-api-access-mjtq7\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.252982 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/57db2fe4-bfc6-499b-a56f-a09824d40ad5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.252991 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/57db2fe4-bfc6-499b-a56f-a09824d40ad5-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.252999 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/57db2fe4-bfc6-499b-a56f-a09824d40ad5-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.253009 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/57db2fe4-bfc6-499b-a56f-a09824d40ad5-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.253018 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/57db2fe4-bfc6-499b-a56f-a09824d40ad5-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.403861 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57db2fe4-bfc6-499b-a56f-a09824d40ad5" path="/var/lib/kubelet/pods/57db2fe4-bfc6-499b-a56f-a09824d40ad5/volumes" Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.792353 4738 scope.go:117] "RemoveContainer" containerID="8d03b7af0da61c8a2889c272c8a0685c54a5a34ff9d95e348146d587fd0a4a54" Mar 07 07:36:38 crc kubenswrapper[4738]: I0307 07:36:38.792567 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hjssh" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.256470 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-54z5x"] Mar 07 07:36:39 crc kubenswrapper[4738]: E0307 07:36:39.256818 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57db2fe4-bfc6-499b-a56f-a09824d40ad5" containerName="swift-ring-rebalance" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.256834 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="57db2fe4-bfc6-499b-a56f-a09824d40ad5" containerName="swift-ring-rebalance" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.257062 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="57db2fe4-bfc6-499b-a56f-a09824d40ad5" containerName="swift-ring-rebalance" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.258474 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.261497 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.261731 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.265663 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-54z5x"] Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.370301 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65e83d38-703c-4813-ae93-d27da8a242d5-ring-data-devices\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.370374 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65e83d38-703c-4813-ae93-d27da8a242d5-swiftconf\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.370612 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65e83d38-703c-4813-ae93-d27da8a242d5-scripts\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.370680 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65e83d38-703c-4813-ae93-d27da8a242d5-dispersionconf\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.370845 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4hdz\" (UniqueName: \"kubernetes.io/projected/65e83d38-703c-4813-ae93-d27da8a242d5-kube-api-access-h4hdz\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.370888 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65e83d38-703c-4813-ae93-d27da8a242d5-etc-swift\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.472705 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65e83d38-703c-4813-ae93-d27da8a242d5-etc-swift\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.472797 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65e83d38-703c-4813-ae93-d27da8a242d5-ring-data-devices\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 
07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.472826 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65e83d38-703c-4813-ae93-d27da8a242d5-swiftconf\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.472860 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65e83d38-703c-4813-ae93-d27da8a242d5-scripts\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.472876 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65e83d38-703c-4813-ae93-d27da8a242d5-dispersionconf\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.472916 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4hdz\" (UniqueName: \"kubernetes.io/projected/65e83d38-703c-4813-ae93-d27da8a242d5-kube-api-access-h4hdz\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.473546 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65e83d38-703c-4813-ae93-d27da8a242d5-etc-swift\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 
crc kubenswrapper[4738]: I0307 07:36:39.473671 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65e83d38-703c-4813-ae93-d27da8a242d5-ring-data-devices\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.473772 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65e83d38-703c-4813-ae93-d27da8a242d5-scripts\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.477060 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65e83d38-703c-4813-ae93-d27da8a242d5-swiftconf\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.477427 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65e83d38-703c-4813-ae93-d27da8a242d5-dispersionconf\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: I0307 07:36:39.492899 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4hdz\" (UniqueName: \"kubernetes.io/projected/65e83d38-703c-4813-ae93-d27da8a242d5-kube-api-access-h4hdz\") pod \"swift-ring-rebalance-debug-54z5x\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:39 crc kubenswrapper[4738]: 
I0307 07:36:39.582413 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:40 crc kubenswrapper[4738]: I0307 07:36:40.054414 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-54z5x"] Mar 07 07:36:40 crc kubenswrapper[4738]: W0307 07:36:40.060811 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65e83d38_703c_4813_ae93_d27da8a242d5.slice/crio-06f87b62213a2919c3a6496eedaa601345ed040c7f5b03ae744576fb8439c38f WatchSource:0}: Error finding container 06f87b62213a2919c3a6496eedaa601345ed040c7f5b03ae744576fb8439c38f: Status 404 returned error can't find the container with id 06f87b62213a2919c3a6496eedaa601345ed040c7f5b03ae744576fb8439c38f Mar 07 07:36:40 crc kubenswrapper[4738]: I0307 07:36:40.081848 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:40 crc kubenswrapper[4738]: I0307 07:36:40.134384 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:40 crc kubenswrapper[4738]: I0307 07:36:40.821592 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" event={"ID":"65e83d38-703c-4813-ae93-d27da8a242d5","Type":"ContainerStarted","Data":"e7b1ff02080c8489ff785e1ab9c168a5705a38b13bf7679b835d07fcafff69f4"} Mar 07 07:36:40 crc kubenswrapper[4738]: I0307 07:36:40.821865 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" event={"ID":"65e83d38-703c-4813-ae93-d27da8a242d5","Type":"ContainerStarted","Data":"06f87b62213a2919c3a6496eedaa601345ed040c7f5b03ae744576fb8439c38f"} Mar 07 07:36:40 crc kubenswrapper[4738]: I0307 07:36:40.842332 4738 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" podStartSLOduration=1.842313015 podStartE2EDuration="1.842313015s" podCreationTimestamp="2026-03-07 07:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:36:40.839272034 +0000 UTC m=+2219.304259375" watchObservedRunningTime="2026-03-07 07:36:40.842313015 +0000 UTC m=+2219.307300356" Mar 07 07:36:41 crc kubenswrapper[4738]: I0307 07:36:41.832461 4738 generic.go:334] "Generic (PLEG): container finished" podID="65e83d38-703c-4813-ae93-d27da8a242d5" containerID="e7b1ff02080c8489ff785e1ab9c168a5705a38b13bf7679b835d07fcafff69f4" exitCode=0 Mar 07 07:36:41 crc kubenswrapper[4738]: I0307 07:36:41.832509 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" event={"ID":"65e83d38-703c-4813-ae93-d27da8a242d5","Type":"ContainerDied","Data":"e7b1ff02080c8489ff785e1ab9c168a5705a38b13bf7679b835d07fcafff69f4"} Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.266789 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.290882 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-54z5x"] Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.298577 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-54z5x"] Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.326534 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65e83d38-703c-4813-ae93-d27da8a242d5-etc-swift\") pod \"65e83d38-703c-4813-ae93-d27da8a242d5\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.326580 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h4hdz\" (UniqueName: \"kubernetes.io/projected/65e83d38-703c-4813-ae93-d27da8a242d5-kube-api-access-h4hdz\") pod \"65e83d38-703c-4813-ae93-d27da8a242d5\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.326610 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65e83d38-703c-4813-ae93-d27da8a242d5-swiftconf\") pod \"65e83d38-703c-4813-ae93-d27da8a242d5\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.326671 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65e83d38-703c-4813-ae93-d27da8a242d5-scripts\") pod \"65e83d38-703c-4813-ae93-d27da8a242d5\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.326695 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/65e83d38-703c-4813-ae93-d27da8a242d5-dispersionconf\") pod \"65e83d38-703c-4813-ae93-d27da8a242d5\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.326805 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65e83d38-703c-4813-ae93-d27da8a242d5-ring-data-devices\") pod \"65e83d38-703c-4813-ae93-d27da8a242d5\" (UID: \"65e83d38-703c-4813-ae93-d27da8a242d5\") " Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.327183 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65e83d38-703c-4813-ae93-d27da8a242d5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "65e83d38-703c-4813-ae93-d27da8a242d5" (UID: "65e83d38-703c-4813-ae93-d27da8a242d5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.327476 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e83d38-703c-4813-ae93-d27da8a242d5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "65e83d38-703c-4813-ae93-d27da8a242d5" (UID: "65e83d38-703c-4813-ae93-d27da8a242d5"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.334325 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65e83d38-703c-4813-ae93-d27da8a242d5-kube-api-access-h4hdz" (OuterVolumeSpecName: "kube-api-access-h4hdz") pod "65e83d38-703c-4813-ae93-d27da8a242d5" (UID: "65e83d38-703c-4813-ae93-d27da8a242d5"). InnerVolumeSpecName "kube-api-access-h4hdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.344791 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65e83d38-703c-4813-ae93-d27da8a242d5-scripts" (OuterVolumeSpecName: "scripts") pod "65e83d38-703c-4813-ae93-d27da8a242d5" (UID: "65e83d38-703c-4813-ae93-d27da8a242d5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.350327 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e83d38-703c-4813-ae93-d27da8a242d5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "65e83d38-703c-4813-ae93-d27da8a242d5" (UID: "65e83d38-703c-4813-ae93-d27da8a242d5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.362899 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65e83d38-703c-4813-ae93-d27da8a242d5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "65e83d38-703c-4813-ae93-d27da8a242d5" (UID: "65e83d38-703c-4813-ae93-d27da8a242d5"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.428369 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/65e83d38-703c-4813-ae93-d27da8a242d5-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.428408 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h4hdz\" (UniqueName: \"kubernetes.io/projected/65e83d38-703c-4813-ae93-d27da8a242d5-kube-api-access-h4hdz\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.428420 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/65e83d38-703c-4813-ae93-d27da8a242d5-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.428429 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/65e83d38-703c-4813-ae93-d27da8a242d5-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.428440 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/65e83d38-703c-4813-ae93-d27da8a242d5-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.428450 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/65e83d38-703c-4813-ae93-d27da8a242d5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.856521 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06f87b62213a2919c3a6496eedaa601345ed040c7f5b03ae744576fb8439c38f" Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.856643 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54z5x" Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.885573 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-djm66"] Mar 07 07:36:43 crc kubenswrapper[4738]: I0307 07:36:43.885799 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-djm66" podUID="0695881c-3f8c-4d8c-b173-58af1764b1a3" containerName="registry-server" containerID="cri-o://52d4b82a4459293b20285d25c24ac424cd3c48bbea649c4023fc985ea7760f6b" gracePeriod=2 Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.316002 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.394323 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65e83d38-703c-4813-ae93-d27da8a242d5" path="/var/lib/kubelet/pods/65e83d38-703c-4813-ae93-d27da8a242d5/volumes" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.439009 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6"] Mar 07 07:36:44 crc kubenswrapper[4738]: E0307 07:36:44.439298 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0695881c-3f8c-4d8c-b173-58af1764b1a3" containerName="extract-content" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.439310 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0695881c-3f8c-4d8c-b173-58af1764b1a3" containerName="extract-content" Mar 07 07:36:44 crc kubenswrapper[4738]: E0307 07:36:44.439323 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0695881c-3f8c-4d8c-b173-58af1764b1a3" containerName="extract-utilities" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.439330 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0695881c-3f8c-4d8c-b173-58af1764b1a3" containerName="extract-utilities" Mar 07 07:36:44 crc kubenswrapper[4738]: E0307 07:36:44.439344 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65e83d38-703c-4813-ae93-d27da8a242d5" containerName="swift-ring-rebalance" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.439349 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="65e83d38-703c-4813-ae93-d27da8a242d5" containerName="swift-ring-rebalance" Mar 07 07:36:44 crc kubenswrapper[4738]: E0307 07:36:44.439368 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0695881c-3f8c-4d8c-b173-58af1764b1a3" containerName="registry-server" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.439375 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0695881c-3f8c-4d8c-b173-58af1764b1a3" containerName="registry-server" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.439583 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="65e83d38-703c-4813-ae93-d27da8a242d5" containerName="swift-ring-rebalance" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.439609 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0695881c-3f8c-4d8c-b173-58af1764b1a3" containerName="registry-server" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.440808 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.441650 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0695881c-3f8c-4d8c-b173-58af1764b1a3-catalog-content\") pod \"0695881c-3f8c-4d8c-b173-58af1764b1a3\" (UID: \"0695881c-3f8c-4d8c-b173-58af1764b1a3\") " Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.441749 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0695881c-3f8c-4d8c-b173-58af1764b1a3-utilities\") pod \"0695881c-3f8c-4d8c-b173-58af1764b1a3\" (UID: \"0695881c-3f8c-4d8c-b173-58af1764b1a3\") " Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.441787 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wshb4\" (UniqueName: \"kubernetes.io/projected/0695881c-3f8c-4d8c-b173-58af1764b1a3-kube-api-access-wshb4\") pod \"0695881c-3f8c-4d8c-b173-58af1764b1a3\" (UID: \"0695881c-3f8c-4d8c-b173-58af1764b1a3\") " Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.442616 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0695881c-3f8c-4d8c-b173-58af1764b1a3-utilities" (OuterVolumeSpecName: "utilities") pod "0695881c-3f8c-4d8c-b173-58af1764b1a3" (UID: "0695881c-3f8c-4d8c-b173-58af1764b1a3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.446012 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.446191 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.447489 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0695881c-3f8c-4d8c-b173-58af1764b1a3-kube-api-access-wshb4" (OuterVolumeSpecName: "kube-api-access-wshb4") pod "0695881c-3f8c-4d8c-b173-58af1764b1a3" (UID: "0695881c-3f8c-4d8c-b173-58af1764b1a3"). InnerVolumeSpecName "kube-api-access-wshb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.449005 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6"] Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.543506 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnpfs\" (UniqueName: \"kubernetes.io/projected/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-kube-api-access-bnpfs\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.543587 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-etc-swift\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.543710 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-swiftconf\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.543986 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-dispersionconf\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.544048 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-scripts\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.544119 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-ring-data-devices\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.544313 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0695881c-3f8c-4d8c-b173-58af1764b1a3-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.544325 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wshb4\" (UniqueName: 
\"kubernetes.io/projected/0695881c-3f8c-4d8c-b173-58af1764b1a3-kube-api-access-wshb4\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.582386 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0695881c-3f8c-4d8c-b173-58af1764b1a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0695881c-3f8c-4d8c-b173-58af1764b1a3" (UID: "0695881c-3f8c-4d8c-b173-58af1764b1a3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.645783 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-ring-data-devices\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.645863 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnpfs\" (UniqueName: \"kubernetes.io/projected/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-kube-api-access-bnpfs\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.645910 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-etc-swift\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.645934 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-swiftconf\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.645955 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-dispersionconf\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.645974 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-scripts\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.646013 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0695881c-3f8c-4d8c-b173-58af1764b1a3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.646523 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-etc-swift\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.646671 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-ring-data-devices\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: 
\"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.646695 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-scripts\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.649302 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-dispersionconf\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.650564 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-swiftconf\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.664376 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnpfs\" (UniqueName: \"kubernetes.io/projected/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-kube-api-access-bnpfs\") pod \"swift-ring-rebalance-debug-7b9c6\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.789664 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.868748 4738 generic.go:334] "Generic (PLEG): container finished" podID="0695881c-3f8c-4d8c-b173-58af1764b1a3" containerID="52d4b82a4459293b20285d25c24ac424cd3c48bbea649c4023fc985ea7760f6b" exitCode=0 Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.868791 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djm66" event={"ID":"0695881c-3f8c-4d8c-b173-58af1764b1a3","Type":"ContainerDied","Data":"52d4b82a4459293b20285d25c24ac424cd3c48bbea649c4023fc985ea7760f6b"} Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.868817 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djm66" event={"ID":"0695881c-3f8c-4d8c-b173-58af1764b1a3","Type":"ContainerDied","Data":"cbeb85f6a2df650d90f9e482123fb21ae76aa0834e1f242475ee2bc1e854715b"} Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.868832 4738 scope.go:117] "RemoveContainer" containerID="52d4b82a4459293b20285d25c24ac424cd3c48bbea649c4023fc985ea7760f6b" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.868945 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-djm66" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.910568 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-djm66"] Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.924099 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-djm66"] Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.926251 4738 scope.go:117] "RemoveContainer" containerID="964850aaab647ea7f9d425dc1fe5c6f59e9d3b713a050bf1500bf85fa630b273" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.952507 4738 scope.go:117] "RemoveContainer" containerID="f9cdc0e1830ecd31281e0150d5cee15eb69a17d289ce28c460ee9cdde8a86a66" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.980473 4738 scope.go:117] "RemoveContainer" containerID="52d4b82a4459293b20285d25c24ac424cd3c48bbea649c4023fc985ea7760f6b" Mar 07 07:36:44 crc kubenswrapper[4738]: E0307 07:36:44.980887 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52d4b82a4459293b20285d25c24ac424cd3c48bbea649c4023fc985ea7760f6b\": container with ID starting with 52d4b82a4459293b20285d25c24ac424cd3c48bbea649c4023fc985ea7760f6b not found: ID does not exist" containerID="52d4b82a4459293b20285d25c24ac424cd3c48bbea649c4023fc985ea7760f6b" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.980918 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52d4b82a4459293b20285d25c24ac424cd3c48bbea649c4023fc985ea7760f6b"} err="failed to get container status \"52d4b82a4459293b20285d25c24ac424cd3c48bbea649c4023fc985ea7760f6b\": rpc error: code = NotFound desc = could not find container \"52d4b82a4459293b20285d25c24ac424cd3c48bbea649c4023fc985ea7760f6b\": container with ID starting with 52d4b82a4459293b20285d25c24ac424cd3c48bbea649c4023fc985ea7760f6b not found: ID does 
not exist" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.980939 4738 scope.go:117] "RemoveContainer" containerID="964850aaab647ea7f9d425dc1fe5c6f59e9d3b713a050bf1500bf85fa630b273" Mar 07 07:36:44 crc kubenswrapper[4738]: E0307 07:36:44.981316 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"964850aaab647ea7f9d425dc1fe5c6f59e9d3b713a050bf1500bf85fa630b273\": container with ID starting with 964850aaab647ea7f9d425dc1fe5c6f59e9d3b713a050bf1500bf85fa630b273 not found: ID does not exist" containerID="964850aaab647ea7f9d425dc1fe5c6f59e9d3b713a050bf1500bf85fa630b273" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.981386 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"964850aaab647ea7f9d425dc1fe5c6f59e9d3b713a050bf1500bf85fa630b273"} err="failed to get container status \"964850aaab647ea7f9d425dc1fe5c6f59e9d3b713a050bf1500bf85fa630b273\": rpc error: code = NotFound desc = could not find container \"964850aaab647ea7f9d425dc1fe5c6f59e9d3b713a050bf1500bf85fa630b273\": container with ID starting with 964850aaab647ea7f9d425dc1fe5c6f59e9d3b713a050bf1500bf85fa630b273 not found: ID does not exist" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.981416 4738 scope.go:117] "RemoveContainer" containerID="f9cdc0e1830ecd31281e0150d5cee15eb69a17d289ce28c460ee9cdde8a86a66" Mar 07 07:36:44 crc kubenswrapper[4738]: E0307 07:36:44.981786 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9cdc0e1830ecd31281e0150d5cee15eb69a17d289ce28c460ee9cdde8a86a66\": container with ID starting with f9cdc0e1830ecd31281e0150d5cee15eb69a17d289ce28c460ee9cdde8a86a66 not found: ID does not exist" containerID="f9cdc0e1830ecd31281e0150d5cee15eb69a17d289ce28c460ee9cdde8a86a66" Mar 07 07:36:44 crc kubenswrapper[4738]: I0307 07:36:44.981822 4738 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9cdc0e1830ecd31281e0150d5cee15eb69a17d289ce28c460ee9cdde8a86a66"} err="failed to get container status \"f9cdc0e1830ecd31281e0150d5cee15eb69a17d289ce28c460ee9cdde8a86a66\": rpc error: code = NotFound desc = could not find container \"f9cdc0e1830ecd31281e0150d5cee15eb69a17d289ce28c460ee9cdde8a86a66\": container with ID starting with f9cdc0e1830ecd31281e0150d5cee15eb69a17d289ce28c460ee9cdde8a86a66 not found: ID does not exist" Mar 07 07:36:45 crc kubenswrapper[4738]: I0307 07:36:45.273977 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6"] Mar 07 07:36:45 crc kubenswrapper[4738]: W0307 07:36:45.284017 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b770e3b_9ce0_4ca2_807e_96af3daf75a7.slice/crio-de77891b61a6e760994f480f050003e4065aa315d5d3dc91d0d7cb396d4ac98e WatchSource:0}: Error finding container de77891b61a6e760994f480f050003e4065aa315d5d3dc91d0d7cb396d4ac98e: Status 404 returned error can't find the container with id de77891b61a6e760994f480f050003e4065aa315d5d3dc91d0d7cb396d4ac98e Mar 07 07:36:45 crc kubenswrapper[4738]: I0307 07:36:45.882274 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" event={"ID":"6b770e3b-9ce0-4ca2-807e-96af3daf75a7","Type":"ContainerStarted","Data":"c22601e3b6ae8c553b2b5ad10c18a384d9dbeed113a6a1852c0270a3768a2516"} Mar 07 07:36:45 crc kubenswrapper[4738]: I0307 07:36:45.882697 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" event={"ID":"6b770e3b-9ce0-4ca2-807e-96af3daf75a7","Type":"ContainerStarted","Data":"de77891b61a6e760994f480f050003e4065aa315d5d3dc91d0d7cb396d4ac98e"} Mar 07 07:36:45 crc kubenswrapper[4738]: I0307 07:36:45.935966 4738 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" podStartSLOduration=1.935936678 podStartE2EDuration="1.935936678s" podCreationTimestamp="2026-03-07 07:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:36:45.904309595 +0000 UTC m=+2224.369296936" watchObservedRunningTime="2026-03-07 07:36:45.935936678 +0000 UTC m=+2224.400924019" Mar 07 07:36:46 crc kubenswrapper[4738]: I0307 07:36:46.308645 4738 scope.go:117] "RemoveContainer" containerID="58d77757efa2e302020f8962dbd35434698242e8e66abb6fd854537f3f5dc7a0" Mar 07 07:36:46 crc kubenswrapper[4738]: I0307 07:36:46.398264 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0695881c-3f8c-4d8c-b173-58af1764b1a3" path="/var/lib/kubelet/pods/0695881c-3f8c-4d8c-b173-58af1764b1a3/volumes" Mar 07 07:36:46 crc kubenswrapper[4738]: I0307 07:36:46.898313 4738 generic.go:334] "Generic (PLEG): container finished" podID="6b770e3b-9ce0-4ca2-807e-96af3daf75a7" containerID="c22601e3b6ae8c553b2b5ad10c18a384d9dbeed113a6a1852c0270a3768a2516" exitCode=0 Mar 07 07:36:46 crc kubenswrapper[4738]: I0307 07:36:46.898383 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" event={"ID":"6b770e3b-9ce0-4ca2-807e-96af3daf75a7","Type":"ContainerDied","Data":"c22601e3b6ae8c553b2b5ad10c18a384d9dbeed113a6a1852c0270a3768a2516"} Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.197262 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.232482 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6"] Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.239228 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6"] Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.303774 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-scripts\") pod \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.304137 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-dispersionconf\") pod \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.304213 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-ring-data-devices\") pod \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.304254 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-swiftconf\") pod \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.304303 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bnpfs\" 
(UniqueName: \"kubernetes.io/projected/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-kube-api-access-bnpfs\") pod \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.304348 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-etc-swift\") pod \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\" (UID: \"6b770e3b-9ce0-4ca2-807e-96af3daf75a7\") " Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.305043 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6b770e3b-9ce0-4ca2-807e-96af3daf75a7" (UID: "6b770e3b-9ce0-4ca2-807e-96af3daf75a7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.305128 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6b770e3b-9ce0-4ca2-807e-96af3daf75a7" (UID: "6b770e3b-9ce0-4ca2-807e-96af3daf75a7"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.305451 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.305476 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.308741 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-kube-api-access-bnpfs" (OuterVolumeSpecName: "kube-api-access-bnpfs") pod "6b770e3b-9ce0-4ca2-807e-96af3daf75a7" (UID: "6b770e3b-9ce0-4ca2-807e-96af3daf75a7"). InnerVolumeSpecName "kube-api-access-bnpfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.321628 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-scripts" (OuterVolumeSpecName: "scripts") pod "6b770e3b-9ce0-4ca2-807e-96af3daf75a7" (UID: "6b770e3b-9ce0-4ca2-807e-96af3daf75a7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.326814 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6b770e3b-9ce0-4ca2-807e-96af3daf75a7" (UID: "6b770e3b-9ce0-4ca2-807e-96af3daf75a7"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.331112 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6b770e3b-9ce0-4ca2-807e-96af3daf75a7" (UID: "6b770e3b-9ce0-4ca2-807e-96af3daf75a7"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.397367 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b770e3b-9ce0-4ca2-807e-96af3daf75a7" path="/var/lib/kubelet/pods/6b770e3b-9ce0-4ca2-807e-96af3daf75a7/volumes" Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.406514 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.406550 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bnpfs\" (UniqueName: \"kubernetes.io/projected/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-kube-api-access-bnpfs\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.406560 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.406569 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6b770e3b-9ce0-4ca2-807e-96af3daf75a7-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:48 crc kubenswrapper[4738]: I0307 07:36:48.921838 4738 scope.go:117] "RemoveContainer" containerID="c22601e3b6ae8c553b2b5ad10c18a384d9dbeed113a6a1852c0270a3768a2516" Mar 07 07:36:48 crc 
kubenswrapper[4738]: I0307 07:36:48.922222 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7b9c6" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.394375 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv"] Mar 07 07:36:49 crc kubenswrapper[4738]: E0307 07:36:49.394772 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b770e3b-9ce0-4ca2-807e-96af3daf75a7" containerName="swift-ring-rebalance" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.394790 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b770e3b-9ce0-4ca2-807e-96af3daf75a7" containerName="swift-ring-rebalance" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.395006 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b770e3b-9ce0-4ca2-807e-96af3daf75a7" containerName="swift-ring-rebalance" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.395693 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.397973 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.401047 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.404549 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv"] Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.523404 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f7a5a468-1ba2-4aae-9fa4-48168172dc14-dispersionconf\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.523500 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7a5a468-1ba2-4aae-9fa4-48168172dc14-scripts\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.523630 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f7a5a468-1ba2-4aae-9fa4-48168172dc14-ring-data-devices\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.523715 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f7a5a468-1ba2-4aae-9fa4-48168172dc14-etc-swift\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.523972 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsfr7\" (UniqueName: \"kubernetes.io/projected/f7a5a468-1ba2-4aae-9fa4-48168172dc14-kube-api-access-zsfr7\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.524113 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f7a5a468-1ba2-4aae-9fa4-48168172dc14-swiftconf\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.625409 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f7a5a468-1ba2-4aae-9fa4-48168172dc14-etc-swift\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.625989 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsfr7\" (UniqueName: \"kubernetes.io/projected/f7a5a468-1ba2-4aae-9fa4-48168172dc14-kube-api-access-zsfr7\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 
crc kubenswrapper[4738]: I0307 07:36:49.626127 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f7a5a468-1ba2-4aae-9fa4-48168172dc14-swiftconf\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.626251 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f7a5a468-1ba2-4aae-9fa4-48168172dc14-dispersionconf\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.626409 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7a5a468-1ba2-4aae-9fa4-48168172dc14-scripts\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.626491 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f7a5a468-1ba2-4aae-9fa4-48168172dc14-ring-data-devices\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.625830 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f7a5a468-1ba2-4aae-9fa4-48168172dc14-etc-swift\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc 
kubenswrapper[4738]: I0307 07:36:49.627093 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f7a5a468-1ba2-4aae-9fa4-48168172dc14-ring-data-devices\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.627731 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7a5a468-1ba2-4aae-9fa4-48168172dc14-scripts\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.633535 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f7a5a468-1ba2-4aae-9fa4-48168172dc14-swiftconf\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.633987 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f7a5a468-1ba2-4aae-9fa4-48168172dc14-dispersionconf\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: I0307 07:36:49.644609 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsfr7\" (UniqueName: \"kubernetes.io/projected/f7a5a468-1ba2-4aae-9fa4-48168172dc14-kube-api-access-zsfr7\") pod \"swift-ring-rebalance-debug-8sqhv\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:49 crc kubenswrapper[4738]: 
I0307 07:36:49.710476 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:50 crc kubenswrapper[4738]: I0307 07:36:50.173175 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv"] Mar 07 07:36:50 crc kubenswrapper[4738]: I0307 07:36:50.946964 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" event={"ID":"f7a5a468-1ba2-4aae-9fa4-48168172dc14","Type":"ContainerStarted","Data":"561f066cbf67bfe70ed5681bc18474871b252207a764fa4fd80eec35aab167ca"} Mar 07 07:36:50 crc kubenswrapper[4738]: I0307 07:36:50.947614 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" event={"ID":"f7a5a468-1ba2-4aae-9fa4-48168172dc14","Type":"ContainerStarted","Data":"712efb8b4a7f7f9153e16d674008694c77913774a0b255305aa30b81c43c4c61"} Mar 07 07:36:50 crc kubenswrapper[4738]: I0307 07:36:50.968469 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" podStartSLOduration=1.968449951 podStartE2EDuration="1.968449951s" podCreationTimestamp="2026-03-07 07:36:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:36:50.965612246 +0000 UTC m=+2229.430599577" watchObservedRunningTime="2026-03-07 07:36:50.968449951 +0000 UTC m=+2229.433437272" Mar 07 07:36:51 crc kubenswrapper[4738]: I0307 07:36:51.961521 4738 generic.go:334] "Generic (PLEG): container finished" podID="f7a5a468-1ba2-4aae-9fa4-48168172dc14" containerID="561f066cbf67bfe70ed5681bc18474871b252207a764fa4fd80eec35aab167ca" exitCode=0 Mar 07 07:36:51 crc kubenswrapper[4738]: I0307 07:36:51.961576 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" 
event={"ID":"f7a5a468-1ba2-4aae-9fa4-48168172dc14","Type":"ContainerDied","Data":"561f066cbf67bfe70ed5681bc18474871b252207a764fa4fd80eec35aab167ca"} Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.331888 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.368083 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv"] Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.368147 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv"] Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.401692 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f7a5a468-1ba2-4aae-9fa4-48168172dc14-swiftconf\") pod \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.401785 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f7a5a468-1ba2-4aae-9fa4-48168172dc14-ring-data-devices\") pod \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.401809 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7a5a468-1ba2-4aae-9fa4-48168172dc14-scripts\") pod \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.401828 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f7a5a468-1ba2-4aae-9fa4-48168172dc14-etc-swift\") pod 
\"f7a5a468-1ba2-4aae-9fa4-48168172dc14\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.401846 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsfr7\" (UniqueName: \"kubernetes.io/projected/f7a5a468-1ba2-4aae-9fa4-48168172dc14-kube-api-access-zsfr7\") pod \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.401864 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f7a5a468-1ba2-4aae-9fa4-48168172dc14-dispersionconf\") pod \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\" (UID: \"f7a5a468-1ba2-4aae-9fa4-48168172dc14\") " Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.403669 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7a5a468-1ba2-4aae-9fa4-48168172dc14-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f7a5a468-1ba2-4aae-9fa4-48168172dc14" (UID: "f7a5a468-1ba2-4aae-9fa4-48168172dc14"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.404244 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f7a5a468-1ba2-4aae-9fa4-48168172dc14-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f7a5a468-1ba2-4aae-9fa4-48168172dc14" (UID: "f7a5a468-1ba2-4aae-9fa4-48168172dc14"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.416936 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7a5a468-1ba2-4aae-9fa4-48168172dc14-kube-api-access-zsfr7" (OuterVolumeSpecName: "kube-api-access-zsfr7") pod "f7a5a468-1ba2-4aae-9fa4-48168172dc14" (UID: "f7a5a468-1ba2-4aae-9fa4-48168172dc14"). InnerVolumeSpecName "kube-api-access-zsfr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.424807 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7a5a468-1ba2-4aae-9fa4-48168172dc14-scripts" (OuterVolumeSpecName: "scripts") pod "f7a5a468-1ba2-4aae-9fa4-48168172dc14" (UID: "f7a5a468-1ba2-4aae-9fa4-48168172dc14"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.426438 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a5a468-1ba2-4aae-9fa4-48168172dc14-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f7a5a468-1ba2-4aae-9fa4-48168172dc14" (UID: "f7a5a468-1ba2-4aae-9fa4-48168172dc14"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.428221 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7a5a468-1ba2-4aae-9fa4-48168172dc14-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f7a5a468-1ba2-4aae-9fa4-48168172dc14" (UID: "f7a5a468-1ba2-4aae-9fa4-48168172dc14"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.502789 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f7a5a468-1ba2-4aae-9fa4-48168172dc14-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.503045 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f7a5a468-1ba2-4aae-9fa4-48168172dc14-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.503179 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f7a5a468-1ba2-4aae-9fa4-48168172dc14-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.503268 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsfr7\" (UniqueName: \"kubernetes.io/projected/f7a5a468-1ba2-4aae-9fa4-48168172dc14-kube-api-access-zsfr7\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.503470 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f7a5a468-1ba2-4aae-9fa4-48168172dc14-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.503546 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f7a5a468-1ba2-4aae-9fa4-48168172dc14-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.989122 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="712efb8b4a7f7f9153e16d674008694c77913774a0b255305aa30b81c43c4c61" Mar 07 07:36:53 crc kubenswrapper[4738]: I0307 07:36:53.989269 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8sqhv" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.397555 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7a5a468-1ba2-4aae-9fa4-48168172dc14" path="/var/lib/kubelet/pods/f7a5a468-1ba2-4aae-9fa4-48168172dc14/volumes" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.485222 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt"] Mar 07 07:36:54 crc kubenswrapper[4738]: E0307 07:36:54.485552 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7a5a468-1ba2-4aae-9fa4-48168172dc14" containerName="swift-ring-rebalance" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.485576 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7a5a468-1ba2-4aae-9fa4-48168172dc14" containerName="swift-ring-rebalance" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.485721 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7a5a468-1ba2-4aae-9fa4-48168172dc14" containerName="swift-ring-rebalance" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.486187 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.488309 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.489377 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.497482 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt"] Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.518506 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d32286f-2ab5-48b7-b13b-15889c89435e-etc-swift\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.518580 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d32286f-2ab5-48b7-b13b-15889c89435e-ring-data-devices\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.518667 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d32286f-2ab5-48b7-b13b-15889c89435e-scripts\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.518716 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d32286f-2ab5-48b7-b13b-15889c89435e-dispersionconf\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.518755 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d32286f-2ab5-48b7-b13b-15889c89435e-swiftconf\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.518779 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srvr6\" (UniqueName: \"kubernetes.io/projected/7d32286f-2ab5-48b7-b13b-15889c89435e-kube-api-access-srvr6\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.619734 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d32286f-2ab5-48b7-b13b-15889c89435e-ring-data-devices\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.619801 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d32286f-2ab5-48b7-b13b-15889c89435e-etc-swift\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 
crc kubenswrapper[4738]: I0307 07:36:54.619863 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d32286f-2ab5-48b7-b13b-15889c89435e-scripts\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.619901 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d32286f-2ab5-48b7-b13b-15889c89435e-dispersionconf\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.619940 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d32286f-2ab5-48b7-b13b-15889c89435e-swiftconf\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.619966 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srvr6\" (UniqueName: \"kubernetes.io/projected/7d32286f-2ab5-48b7-b13b-15889c89435e-kube-api-access-srvr6\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.620677 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d32286f-2ab5-48b7-b13b-15889c89435e-etc-swift\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc 
kubenswrapper[4738]: I0307 07:36:54.620940 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d32286f-2ab5-48b7-b13b-15889c89435e-ring-data-devices\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.621108 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d32286f-2ab5-48b7-b13b-15889c89435e-scripts\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.625089 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d32286f-2ab5-48b7-b13b-15889c89435e-swiftconf\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.627972 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d32286f-2ab5-48b7-b13b-15889c89435e-dispersionconf\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: I0307 07:36:54.638749 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srvr6\" (UniqueName: \"kubernetes.io/projected/7d32286f-2ab5-48b7-b13b-15889c89435e-kube-api-access-srvr6\") pod \"swift-ring-rebalance-debug-b9sbt\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:54 crc kubenswrapper[4738]: 
I0307 07:36:54.851046 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:55 crc kubenswrapper[4738]: I0307 07:36:55.307171 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt"] Mar 07 07:36:56 crc kubenswrapper[4738]: I0307 07:36:56.025539 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" event={"ID":"7d32286f-2ab5-48b7-b13b-15889c89435e","Type":"ContainerStarted","Data":"4f7832d62a82709b88926b7516b6cc2660470adbc0a1edcc9db6c1c66d22a664"} Mar 07 07:36:56 crc kubenswrapper[4738]: I0307 07:36:56.026035 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" event={"ID":"7d32286f-2ab5-48b7-b13b-15889c89435e","Type":"ContainerStarted","Data":"f587884cac5931f4d45134c3958775f4b5ac08c435f73896de866799f8503d25"} Mar 07 07:36:56 crc kubenswrapper[4738]: I0307 07:36:56.051506 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" podStartSLOduration=2.051483063 podStartE2EDuration="2.051483063s" podCreationTimestamp="2026-03-07 07:36:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:36:56.047583979 +0000 UTC m=+2234.512571310" watchObservedRunningTime="2026-03-07 07:36:56.051483063 +0000 UTC m=+2234.516470404" Mar 07 07:36:56 crc kubenswrapper[4738]: I0307 07:36:56.957630 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:36:56 crc kubenswrapper[4738]: I0307 07:36:56.957935 4738 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:36:57 crc kubenswrapper[4738]: I0307 07:36:57.039870 4738 generic.go:334] "Generic (PLEG): container finished" podID="7d32286f-2ab5-48b7-b13b-15889c89435e" containerID="4f7832d62a82709b88926b7516b6cc2660470adbc0a1edcc9db6c1c66d22a664" exitCode=0 Mar 07 07:36:57 crc kubenswrapper[4738]: I0307 07:36:57.039921 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" event={"ID":"7d32286f-2ab5-48b7-b13b-15889c89435e","Type":"ContainerDied","Data":"4f7832d62a82709b88926b7516b6cc2660470adbc0a1edcc9db6c1c66d22a664"} Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.372615 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.413356 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt"] Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.419508 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt"] Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.519471 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d32286f-2ab5-48b7-b13b-15889c89435e-scripts\") pod \"7d32286f-2ab5-48b7-b13b-15889c89435e\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.519535 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d32286f-2ab5-48b7-b13b-15889c89435e-swiftconf\") pod \"7d32286f-2ab5-48b7-b13b-15889c89435e\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.519617 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srvr6\" (UniqueName: \"kubernetes.io/projected/7d32286f-2ab5-48b7-b13b-15889c89435e-kube-api-access-srvr6\") pod \"7d32286f-2ab5-48b7-b13b-15889c89435e\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.519692 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d32286f-2ab5-48b7-b13b-15889c89435e-etc-swift\") pod \"7d32286f-2ab5-48b7-b13b-15889c89435e\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.519741 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/7d32286f-2ab5-48b7-b13b-15889c89435e-dispersionconf\") pod \"7d32286f-2ab5-48b7-b13b-15889c89435e\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.519765 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d32286f-2ab5-48b7-b13b-15889c89435e-ring-data-devices\") pod \"7d32286f-2ab5-48b7-b13b-15889c89435e\" (UID: \"7d32286f-2ab5-48b7-b13b-15889c89435e\") " Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.520627 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d32286f-2ab5-48b7-b13b-15889c89435e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7d32286f-2ab5-48b7-b13b-15889c89435e" (UID: "7d32286f-2ab5-48b7-b13b-15889c89435e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.521334 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d32286f-2ab5-48b7-b13b-15889c89435e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7d32286f-2ab5-48b7-b13b-15889c89435e" (UID: "7d32286f-2ab5-48b7-b13b-15889c89435e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.526011 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d32286f-2ab5-48b7-b13b-15889c89435e-kube-api-access-srvr6" (OuterVolumeSpecName: "kube-api-access-srvr6") pod "7d32286f-2ab5-48b7-b13b-15889c89435e" (UID: "7d32286f-2ab5-48b7-b13b-15889c89435e"). InnerVolumeSpecName "kube-api-access-srvr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.540927 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d32286f-2ab5-48b7-b13b-15889c89435e-scripts" (OuterVolumeSpecName: "scripts") pod "7d32286f-2ab5-48b7-b13b-15889c89435e" (UID: "7d32286f-2ab5-48b7-b13b-15889c89435e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.554573 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d32286f-2ab5-48b7-b13b-15889c89435e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7d32286f-2ab5-48b7-b13b-15889c89435e" (UID: "7d32286f-2ab5-48b7-b13b-15889c89435e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.556593 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d32286f-2ab5-48b7-b13b-15889c89435e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7d32286f-2ab5-48b7-b13b-15889c89435e" (UID: "7d32286f-2ab5-48b7-b13b-15889c89435e"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.622422 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srvr6\" (UniqueName: \"kubernetes.io/projected/7d32286f-2ab5-48b7-b13b-15889c89435e-kube-api-access-srvr6\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.622458 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d32286f-2ab5-48b7-b13b-15889c89435e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.622471 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d32286f-2ab5-48b7-b13b-15889c89435e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.622483 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d32286f-2ab5-48b7-b13b-15889c89435e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.622497 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d32286f-2ab5-48b7-b13b-15889c89435e-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:58 crc kubenswrapper[4738]: I0307 07:36:58.622508 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d32286f-2ab5-48b7-b13b-15889c89435e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.058487 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f587884cac5931f4d45134c3958775f4b5ac08c435f73896de866799f8503d25" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.058561 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b9sbt" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.574365 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs"] Mar 07 07:36:59 crc kubenswrapper[4738]: E0307 07:36:59.575090 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d32286f-2ab5-48b7-b13b-15889c89435e" containerName="swift-ring-rebalance" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.575107 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d32286f-2ab5-48b7-b13b-15889c89435e" containerName="swift-ring-rebalance" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.575342 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d32286f-2ab5-48b7-b13b-15889c89435e" containerName="swift-ring-rebalance" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.575994 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.580839 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.581087 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.602088 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs"] Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.641554 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b840c41c-82bf-4e91-bcd9-fbf75322e299-etc-swift\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.641597 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b840c41c-82bf-4e91-bcd9-fbf75322e299-dispersionconf\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.641617 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xctt\" (UniqueName: \"kubernetes.io/projected/b840c41c-82bf-4e91-bcd9-fbf75322e299-kube-api-access-4xctt\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.641684 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b840c41c-82bf-4e91-bcd9-fbf75322e299-ring-data-devices\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.641706 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b840c41c-82bf-4e91-bcd9-fbf75322e299-scripts\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.641722 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/b840c41c-82bf-4e91-bcd9-fbf75322e299-swiftconf\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.743638 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b840c41c-82bf-4e91-bcd9-fbf75322e299-ring-data-devices\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.743697 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b840c41c-82bf-4e91-bcd9-fbf75322e299-scripts\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.743720 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b840c41c-82bf-4e91-bcd9-fbf75322e299-swiftconf\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.743785 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b840c41c-82bf-4e91-bcd9-fbf75322e299-etc-swift\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.743811 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/b840c41c-82bf-4e91-bcd9-fbf75322e299-dispersionconf\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.743832 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xctt\" (UniqueName: \"kubernetes.io/projected/b840c41c-82bf-4e91-bcd9-fbf75322e299-kube-api-access-4xctt\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.744613 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b840c41c-82bf-4e91-bcd9-fbf75322e299-ring-data-devices\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.745206 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b840c41c-82bf-4e91-bcd9-fbf75322e299-scripts\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.745493 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b840c41c-82bf-4e91-bcd9-fbf75322e299-etc-swift\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.749834 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/b840c41c-82bf-4e91-bcd9-fbf75322e299-swiftconf\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.749883 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b840c41c-82bf-4e91-bcd9-fbf75322e299-dispersionconf\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.769421 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xctt\" (UniqueName: \"kubernetes.io/projected/b840c41c-82bf-4e91-bcd9-fbf75322e299-kube-api-access-4xctt\") pod \"swift-ring-rebalance-debug-pw7fs\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:36:59 crc kubenswrapper[4738]: I0307 07:36:59.905091 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:37:00 crc kubenswrapper[4738]: I0307 07:37:00.326908 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs"] Mar 07 07:37:00 crc kubenswrapper[4738]: W0307 07:37:00.333758 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb840c41c_82bf_4e91_bcd9_fbf75322e299.slice/crio-7f4e8ebc0bccb94c9f7109fa297ddf90a94070e4e82aaa21325cc0cedd7c0552 WatchSource:0}: Error finding container 7f4e8ebc0bccb94c9f7109fa297ddf90a94070e4e82aaa21325cc0cedd7c0552: Status 404 returned error can't find the container with id 7f4e8ebc0bccb94c9f7109fa297ddf90a94070e4e82aaa21325cc0cedd7c0552 Mar 07 07:37:00 crc kubenswrapper[4738]: I0307 07:37:00.396855 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d32286f-2ab5-48b7-b13b-15889c89435e" path="/var/lib/kubelet/pods/7d32286f-2ab5-48b7-b13b-15889c89435e/volumes" Mar 07 07:37:01 crc kubenswrapper[4738]: I0307 07:37:01.083324 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" event={"ID":"b840c41c-82bf-4e91-bcd9-fbf75322e299","Type":"ContainerStarted","Data":"cee3ce6883eddcd166e4b294be70391336e58febb18ad07a6bd8b0bc7253db5a"} Mar 07 07:37:01 crc kubenswrapper[4738]: I0307 07:37:01.083375 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" event={"ID":"b840c41c-82bf-4e91-bcd9-fbf75322e299","Type":"ContainerStarted","Data":"7f4e8ebc0bccb94c9f7109fa297ddf90a94070e4e82aaa21325cc0cedd7c0552"} Mar 07 07:37:01 crc kubenswrapper[4738]: I0307 07:37:01.105104 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" podStartSLOduration=2.105081839 podStartE2EDuration="2.105081839s" podCreationTimestamp="2026-03-07 
07:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:37:01.102361626 +0000 UTC m=+2239.567348967" watchObservedRunningTime="2026-03-07 07:37:01.105081839 +0000 UTC m=+2239.570069170" Mar 07 07:37:02 crc kubenswrapper[4738]: I0307 07:37:02.091409 4738 generic.go:334] "Generic (PLEG): container finished" podID="b840c41c-82bf-4e91-bcd9-fbf75322e299" containerID="cee3ce6883eddcd166e4b294be70391336e58febb18ad07a6bd8b0bc7253db5a" exitCode=0 Mar 07 07:37:02 crc kubenswrapper[4738]: I0307 07:37:02.091458 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" event={"ID":"b840c41c-82bf-4e91-bcd9-fbf75322e299","Type":"ContainerDied","Data":"cee3ce6883eddcd166e4b294be70391336e58febb18ad07a6bd8b0bc7253db5a"} Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.388069 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.417783 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs"] Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.423425 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs"] Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.499930 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b840c41c-82bf-4e91-bcd9-fbf75322e299-etc-swift\") pod \"b840c41c-82bf-4e91-bcd9-fbf75322e299\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.500030 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xctt\" (UniqueName: 
\"kubernetes.io/projected/b840c41c-82bf-4e91-bcd9-fbf75322e299-kube-api-access-4xctt\") pod \"b840c41c-82bf-4e91-bcd9-fbf75322e299\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.500094 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b840c41c-82bf-4e91-bcd9-fbf75322e299-ring-data-devices\") pod \"b840c41c-82bf-4e91-bcd9-fbf75322e299\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.500118 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b840c41c-82bf-4e91-bcd9-fbf75322e299-dispersionconf\") pod \"b840c41c-82bf-4e91-bcd9-fbf75322e299\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.500172 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b840c41c-82bf-4e91-bcd9-fbf75322e299-swiftconf\") pod \"b840c41c-82bf-4e91-bcd9-fbf75322e299\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.500226 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b840c41c-82bf-4e91-bcd9-fbf75322e299-scripts\") pod \"b840c41c-82bf-4e91-bcd9-fbf75322e299\" (UID: \"b840c41c-82bf-4e91-bcd9-fbf75322e299\") " Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.501007 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b840c41c-82bf-4e91-bcd9-fbf75322e299-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b840c41c-82bf-4e91-bcd9-fbf75322e299" (UID: "b840c41c-82bf-4e91-bcd9-fbf75322e299"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.501005 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b840c41c-82bf-4e91-bcd9-fbf75322e299-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b840c41c-82bf-4e91-bcd9-fbf75322e299" (UID: "b840c41c-82bf-4e91-bcd9-fbf75322e299"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.501369 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b840c41c-82bf-4e91-bcd9-fbf75322e299-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.501396 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b840c41c-82bf-4e91-bcd9-fbf75322e299-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.505840 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b840c41c-82bf-4e91-bcd9-fbf75322e299-kube-api-access-4xctt" (OuterVolumeSpecName: "kube-api-access-4xctt") pod "b840c41c-82bf-4e91-bcd9-fbf75322e299" (UID: "b840c41c-82bf-4e91-bcd9-fbf75322e299"). InnerVolumeSpecName "kube-api-access-4xctt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.524299 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b840c41c-82bf-4e91-bcd9-fbf75322e299-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b840c41c-82bf-4e91-bcd9-fbf75322e299" (UID: "b840c41c-82bf-4e91-bcd9-fbf75322e299"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.525358 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b840c41c-82bf-4e91-bcd9-fbf75322e299-scripts" (OuterVolumeSpecName: "scripts") pod "b840c41c-82bf-4e91-bcd9-fbf75322e299" (UID: "b840c41c-82bf-4e91-bcd9-fbf75322e299"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.528911 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b840c41c-82bf-4e91-bcd9-fbf75322e299-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b840c41c-82bf-4e91-bcd9-fbf75322e299" (UID: "b840c41c-82bf-4e91-bcd9-fbf75322e299"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.603225 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xctt\" (UniqueName: \"kubernetes.io/projected/b840c41c-82bf-4e91-bcd9-fbf75322e299-kube-api-access-4xctt\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.603262 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b840c41c-82bf-4e91-bcd9-fbf75322e299-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.603273 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b840c41c-82bf-4e91-bcd9-fbf75322e299-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:03 crc kubenswrapper[4738]: I0307 07:37:03.603282 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b840c41c-82bf-4e91-bcd9-fbf75322e299-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:04 crc 
kubenswrapper[4738]: I0307 07:37:04.135881 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f4e8ebc0bccb94c9f7109fa297ddf90a94070e4e82aaa21325cc0cedd7c0552" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.135977 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pw7fs" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.397928 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b840c41c-82bf-4e91-bcd9-fbf75322e299" path="/var/lib/kubelet/pods/b840c41c-82bf-4e91-bcd9-fbf75322e299/volumes" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.610252 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h"] Mar 07 07:37:04 crc kubenswrapper[4738]: E0307 07:37:04.610666 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b840c41c-82bf-4e91-bcd9-fbf75322e299" containerName="swift-ring-rebalance" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.610695 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b840c41c-82bf-4e91-bcd9-fbf75322e299" containerName="swift-ring-rebalance" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.610949 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="b840c41c-82bf-4e91-bcd9-fbf75322e299" containerName="swift-ring-rebalance" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.611728 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.616289 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.616319 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.621652 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h"] Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.717405 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l86t2\" (UniqueName: \"kubernetes.io/projected/45e67405-8801-4596-80be-1c824ae75b6c-kube-api-access-l86t2\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.717972 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/45e67405-8801-4596-80be-1c824ae75b6c-swiftconf\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.718094 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45e67405-8801-4596-80be-1c824ae75b6c-scripts\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.718180 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/45e67405-8801-4596-80be-1c824ae75b6c-ring-data-devices\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.718221 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/45e67405-8801-4596-80be-1c824ae75b6c-etc-swift\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.718242 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/45e67405-8801-4596-80be-1c824ae75b6c-dispersionconf\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.819511 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45e67405-8801-4596-80be-1c824ae75b6c-scripts\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.819568 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/45e67405-8801-4596-80be-1c824ae75b6c-ring-data-devices\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc 
kubenswrapper[4738]: I0307 07:37:04.819607 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/45e67405-8801-4596-80be-1c824ae75b6c-etc-swift\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.819634 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/45e67405-8801-4596-80be-1c824ae75b6c-dispersionconf\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.819663 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l86t2\" (UniqueName: \"kubernetes.io/projected/45e67405-8801-4596-80be-1c824ae75b6c-kube-api-access-l86t2\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.819688 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/45e67405-8801-4596-80be-1c824ae75b6c-swiftconf\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.820252 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/45e67405-8801-4596-80be-1c824ae75b6c-etc-swift\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc 
kubenswrapper[4738]: I0307 07:37:04.820447 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/45e67405-8801-4596-80be-1c824ae75b6c-ring-data-devices\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.820440 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45e67405-8801-4596-80be-1c824ae75b6c-scripts\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.823497 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/45e67405-8801-4596-80be-1c824ae75b6c-swiftconf\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.823534 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/45e67405-8801-4596-80be-1c824ae75b6c-dispersionconf\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: I0307 07:37:04.838662 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l86t2\" (UniqueName: \"kubernetes.io/projected/45e67405-8801-4596-80be-1c824ae75b6c-kube-api-access-l86t2\") pod \"swift-ring-rebalance-debug-ndb6h\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:04 crc kubenswrapper[4738]: 
I0307 07:37:04.942402 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:05 crc kubenswrapper[4738]: I0307 07:37:05.413204 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h"] Mar 07 07:37:06 crc kubenswrapper[4738]: I0307 07:37:06.154643 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" event={"ID":"45e67405-8801-4596-80be-1c824ae75b6c","Type":"ContainerStarted","Data":"939e1a1a1a6c45e6bf61b0123fb97be744abe24c28700c8b58ade411cbe7e427"} Mar 07 07:37:06 crc kubenswrapper[4738]: I0307 07:37:06.155060 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" event={"ID":"45e67405-8801-4596-80be-1c824ae75b6c","Type":"ContainerStarted","Data":"c2bae77e69611e1bd056d0f412931b7d40883d68f43ea91cd62d5567478af60f"} Mar 07 07:37:06 crc kubenswrapper[4738]: I0307 07:37:06.172560 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" podStartSLOduration=2.172540794 podStartE2EDuration="2.172540794s" podCreationTimestamp="2026-03-07 07:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:37:06.169643737 +0000 UTC m=+2244.634631058" watchObservedRunningTime="2026-03-07 07:37:06.172540794 +0000 UTC m=+2244.637528105" Mar 07 07:37:07 crc kubenswrapper[4738]: I0307 07:37:07.165083 4738 generic.go:334] "Generic (PLEG): container finished" podID="45e67405-8801-4596-80be-1c824ae75b6c" containerID="939e1a1a1a6c45e6bf61b0123fb97be744abe24c28700c8b58ade411cbe7e427" exitCode=0 Mar 07 07:37:07 crc kubenswrapper[4738]: I0307 07:37:07.165135 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" 
event={"ID":"45e67405-8801-4596-80be-1c824ae75b6c","Type":"ContainerDied","Data":"939e1a1a1a6c45e6bf61b0123fb97be744abe24c28700c8b58ade411cbe7e427"} Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.522378 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.564647 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h"] Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.571348 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h"] Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.578476 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l86t2\" (UniqueName: \"kubernetes.io/projected/45e67405-8801-4596-80be-1c824ae75b6c-kube-api-access-l86t2\") pod \"45e67405-8801-4596-80be-1c824ae75b6c\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.578689 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/45e67405-8801-4596-80be-1c824ae75b6c-dispersionconf\") pod \"45e67405-8801-4596-80be-1c824ae75b6c\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.578844 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/45e67405-8801-4596-80be-1c824ae75b6c-swiftconf\") pod \"45e67405-8801-4596-80be-1c824ae75b6c\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.578981 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/45e67405-8801-4596-80be-1c824ae75b6c-etc-swift\") pod \"45e67405-8801-4596-80be-1c824ae75b6c\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.579127 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45e67405-8801-4596-80be-1c824ae75b6c-scripts\") pod \"45e67405-8801-4596-80be-1c824ae75b6c\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.579344 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/45e67405-8801-4596-80be-1c824ae75b6c-ring-data-devices\") pod \"45e67405-8801-4596-80be-1c824ae75b6c\" (UID: \"45e67405-8801-4596-80be-1c824ae75b6c\") " Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.580118 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45e67405-8801-4596-80be-1c824ae75b6c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "45e67405-8801-4596-80be-1c824ae75b6c" (UID: "45e67405-8801-4596-80be-1c824ae75b6c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.580330 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45e67405-8801-4596-80be-1c824ae75b6c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "45e67405-8801-4596-80be-1c824ae75b6c" (UID: "45e67405-8801-4596-80be-1c824ae75b6c"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.593348 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45e67405-8801-4596-80be-1c824ae75b6c-kube-api-access-l86t2" (OuterVolumeSpecName: "kube-api-access-l86t2") pod "45e67405-8801-4596-80be-1c824ae75b6c" (UID: "45e67405-8801-4596-80be-1c824ae75b6c"). InnerVolumeSpecName "kube-api-access-l86t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.600973 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e67405-8801-4596-80be-1c824ae75b6c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "45e67405-8801-4596-80be-1c824ae75b6c" (UID: "45e67405-8801-4596-80be-1c824ae75b6c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.605156 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/45e67405-8801-4596-80be-1c824ae75b6c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "45e67405-8801-4596-80be-1c824ae75b6c" (UID: "45e67405-8801-4596-80be-1c824ae75b6c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.610925 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/45e67405-8801-4596-80be-1c824ae75b6c-scripts" (OuterVolumeSpecName: "scripts") pod "45e67405-8801-4596-80be-1c824ae75b6c" (UID: "45e67405-8801-4596-80be-1c824ae75b6c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.681591 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/45e67405-8801-4596-80be-1c824ae75b6c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.681630 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/45e67405-8801-4596-80be-1c824ae75b6c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.681642 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/45e67405-8801-4596-80be-1c824ae75b6c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.681655 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/45e67405-8801-4596-80be-1c824ae75b6c-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.681666 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/45e67405-8801-4596-80be-1c824ae75b6c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:08 crc kubenswrapper[4738]: I0307 07:37:08.681679 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l86t2\" (UniqueName: \"kubernetes.io/projected/45e67405-8801-4596-80be-1c824ae75b6c-kube-api-access-l86t2\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.189502 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2bae77e69611e1bd056d0f412931b7d40883d68f43ea91cd62d5567478af60f" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.189554 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ndb6h" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.743464 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2hb67"] Mar 07 07:37:09 crc kubenswrapper[4738]: E0307 07:37:09.743781 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45e67405-8801-4596-80be-1c824ae75b6c" containerName="swift-ring-rebalance" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.743795 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="45e67405-8801-4596-80be-1c824ae75b6c" containerName="swift-ring-rebalance" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.743989 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="45e67405-8801-4596-80be-1c824ae75b6c" containerName="swift-ring-rebalance" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.744550 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.748233 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.750701 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.769788 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2hb67"] Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.797821 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-ring-data-devices\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.798153 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-dispersionconf\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.798317 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjhk7\" (UniqueName: \"kubernetes.io/projected/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-kube-api-access-bjhk7\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.798448 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-scripts\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.798558 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-etc-swift\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.798665 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-swiftconf\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.899702 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-ring-data-devices\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.899768 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-dispersionconf\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.899797 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjhk7\" (UniqueName: \"kubernetes.io/projected/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-kube-api-access-bjhk7\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.899827 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-scripts\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.899852 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-etc-swift\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.899883 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-swiftconf\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.900725 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-etc-swift\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.900778 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-scripts\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.900916 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-ring-data-devices\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.905262 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-swiftconf\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.905354 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-dispersionconf\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:09 crc kubenswrapper[4738]: I0307 07:37:09.921262 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjhk7\" (UniqueName: \"kubernetes.io/projected/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-kube-api-access-bjhk7\") pod \"swift-ring-rebalance-debug-2hb67\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:10 crc kubenswrapper[4738]: I0307 07:37:10.115282 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:10 crc kubenswrapper[4738]: I0307 07:37:10.397820 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45e67405-8801-4596-80be-1c824ae75b6c" path="/var/lib/kubelet/pods/45e67405-8801-4596-80be-1c824ae75b6c/volumes" Mar 07 07:37:10 crc kubenswrapper[4738]: I0307 07:37:10.572111 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2hb67"] Mar 07 07:37:10 crc kubenswrapper[4738]: W0307 07:37:10.580370 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d4a7725_5c4e_4561_82f4_ecb3000ddba3.slice/crio-60365c0f25a1b46dd4b167741993f85b6390572a37dbaa151bdc7538efc2f10a WatchSource:0}: Error finding container 60365c0f25a1b46dd4b167741993f85b6390572a37dbaa151bdc7538efc2f10a: Status 404 returned error can't find the container with id 60365c0f25a1b46dd4b167741993f85b6390572a37dbaa151bdc7538efc2f10a Mar 07 07:37:11 crc kubenswrapper[4738]: I0307 07:37:11.211487 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" event={"ID":"3d4a7725-5c4e-4561-82f4-ecb3000ddba3","Type":"ContainerStarted","Data":"5d6ad447966589bd7c4379476fa982d1b7d5e58c5c678cf6236fee0c8eb8b5fe"} Mar 07 07:37:11 crc kubenswrapper[4738]: I0307 07:37:11.211966 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" event={"ID":"3d4a7725-5c4e-4561-82f4-ecb3000ddba3","Type":"ContainerStarted","Data":"60365c0f25a1b46dd4b167741993f85b6390572a37dbaa151bdc7538efc2f10a"} Mar 07 07:37:11 crc kubenswrapper[4738]: I0307 07:37:11.241456 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" podStartSLOduration=2.241439157 podStartE2EDuration="2.241439157s" podCreationTimestamp="2026-03-07 
07:37:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:37:11.237265236 +0000 UTC m=+2249.702252567" watchObservedRunningTime="2026-03-07 07:37:11.241439157 +0000 UTC m=+2249.706426488" Mar 07 07:37:13 crc kubenswrapper[4738]: I0307 07:37:13.238384 4738 generic.go:334] "Generic (PLEG): container finished" podID="3d4a7725-5c4e-4561-82f4-ecb3000ddba3" containerID="5d6ad447966589bd7c4379476fa982d1b7d5e58c5c678cf6236fee0c8eb8b5fe" exitCode=0 Mar 07 07:37:13 crc kubenswrapper[4738]: I0307 07:37:13.238520 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" event={"ID":"3d4a7725-5c4e-4561-82f4-ecb3000ddba3","Type":"ContainerDied","Data":"5d6ad447966589bd7c4379476fa982d1b7d5e58c5c678cf6236fee0c8eb8b5fe"} Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.590605 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.625347 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2hb67"] Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.632588 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2hb67"] Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.677621 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-etc-swift\") pod \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.677737 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjhk7\" (UniqueName: 
\"kubernetes.io/projected/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-kube-api-access-bjhk7\") pod \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.677815 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-swiftconf\") pod \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.677885 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-scripts\") pod \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.677920 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-dispersionconf\") pod \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.677958 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-ring-data-devices\") pod \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\" (UID: \"3d4a7725-5c4e-4561-82f4-ecb3000ddba3\") " Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.678719 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3d4a7725-5c4e-4561-82f4-ecb3000ddba3" (UID: "3d4a7725-5c4e-4561-82f4-ecb3000ddba3"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.684181 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-kube-api-access-bjhk7" (OuterVolumeSpecName: "kube-api-access-bjhk7") pod "3d4a7725-5c4e-4561-82f4-ecb3000ddba3" (UID: "3d4a7725-5c4e-4561-82f4-ecb3000ddba3"). InnerVolumeSpecName "kube-api-access-bjhk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.686602 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3d4a7725-5c4e-4561-82f4-ecb3000ddba3" (UID: "3d4a7725-5c4e-4561-82f4-ecb3000ddba3"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.698709 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3d4a7725-5c4e-4561-82f4-ecb3000ddba3" (UID: "3d4a7725-5c4e-4561-82f4-ecb3000ddba3"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.702878 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3d4a7725-5c4e-4561-82f4-ecb3000ddba3" (UID: "3d4a7725-5c4e-4561-82f4-ecb3000ddba3"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.705297 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-scripts" (OuterVolumeSpecName: "scripts") pod "3d4a7725-5c4e-4561-82f4-ecb3000ddba3" (UID: "3d4a7725-5c4e-4561-82f4-ecb3000ddba3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.780396 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.780444 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.780464 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.780487 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.780511 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:14 crc kubenswrapper[4738]: I0307 07:37:14.780558 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjhk7\" (UniqueName: 
\"kubernetes.io/projected/3d4a7725-5c4e-4561-82f4-ecb3000ddba3-kube-api-access-bjhk7\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.257888 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60365c0f25a1b46dd4b167741993f85b6390572a37dbaa151bdc7538efc2f10a" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.258294 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hb67" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.770250 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g"] Mar 07 07:37:15 crc kubenswrapper[4738]: E0307 07:37:15.770707 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d4a7725-5c4e-4561-82f4-ecb3000ddba3" containerName="swift-ring-rebalance" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.770727 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d4a7725-5c4e-4561-82f4-ecb3000ddba3" containerName="swift-ring-rebalance" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.770953 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d4a7725-5c4e-4561-82f4-ecb3000ddba3" containerName="swift-ring-rebalance" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.771728 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.774618 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.774797 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.783462 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g"] Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.895108 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx2p9\" (UniqueName: \"kubernetes.io/projected/2594bfff-5187-46b1-8a1f-608db42850b1-kube-api-access-lx2p9\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.895454 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2594bfff-5187-46b1-8a1f-608db42850b1-scripts\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.895485 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2594bfff-5187-46b1-8a1f-608db42850b1-etc-swift\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.895521 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2594bfff-5187-46b1-8a1f-608db42850b1-dispersionconf\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.895627 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2594bfff-5187-46b1-8a1f-608db42850b1-ring-data-devices\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.895768 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2594bfff-5187-46b1-8a1f-608db42850b1-swiftconf\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.997034 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2594bfff-5187-46b1-8a1f-608db42850b1-swiftconf\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.997140 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx2p9\" (UniqueName: \"kubernetes.io/projected/2594bfff-5187-46b1-8a1f-608db42850b1-kube-api-access-lx2p9\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:15 crc 
kubenswrapper[4738]: I0307 07:37:15.997226 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2594bfff-5187-46b1-8a1f-608db42850b1-scripts\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.997261 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2594bfff-5187-46b1-8a1f-608db42850b1-etc-swift\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.997291 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2594bfff-5187-46b1-8a1f-608db42850b1-dispersionconf\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.997317 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2594bfff-5187-46b1-8a1f-608db42850b1-ring-data-devices\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.997814 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2594bfff-5187-46b1-8a1f-608db42850b1-etc-swift\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:15 crc 
kubenswrapper[4738]: I0307 07:37:15.998041 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2594bfff-5187-46b1-8a1f-608db42850b1-ring-data-devices\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:15 crc kubenswrapper[4738]: I0307 07:37:15.998300 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2594bfff-5187-46b1-8a1f-608db42850b1-scripts\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:16 crc kubenswrapper[4738]: I0307 07:37:16.007776 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2594bfff-5187-46b1-8a1f-608db42850b1-swiftconf\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:16 crc kubenswrapper[4738]: I0307 07:37:16.013526 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2594bfff-5187-46b1-8a1f-608db42850b1-dispersionconf\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:16 crc kubenswrapper[4738]: I0307 07:37:16.021740 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lx2p9\" (UniqueName: \"kubernetes.io/projected/2594bfff-5187-46b1-8a1f-608db42850b1-kube-api-access-lx2p9\") pod \"swift-ring-rebalance-debug-2hg2g\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:16 crc kubenswrapper[4738]: 
I0307 07:37:16.093737 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:16 crc kubenswrapper[4738]: I0307 07:37:16.287545 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g"] Mar 07 07:37:16 crc kubenswrapper[4738]: I0307 07:37:16.412236 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d4a7725-5c4e-4561-82f4-ecb3000ddba3" path="/var/lib/kubelet/pods/3d4a7725-5c4e-4561-82f4-ecb3000ddba3/volumes" Mar 07 07:37:17 crc kubenswrapper[4738]: I0307 07:37:17.275525 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" event={"ID":"2594bfff-5187-46b1-8a1f-608db42850b1","Type":"ContainerStarted","Data":"d8032b66c77f34fa43304d23968c8c4872fafc507fdede388f6ee8713a5e983d"} Mar 07 07:37:17 crc kubenswrapper[4738]: I0307 07:37:17.275787 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" event={"ID":"2594bfff-5187-46b1-8a1f-608db42850b1","Type":"ContainerStarted","Data":"88c25bde5f1f5c9d60ca2650dfa6f694176b2b30ff6a4fb59a285b033f6cb069"} Mar 07 07:37:17 crc kubenswrapper[4738]: I0307 07:37:17.296947 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" podStartSLOduration=2.296923797 podStartE2EDuration="2.296923797s" podCreationTimestamp="2026-03-07 07:37:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:37:17.294565384 +0000 UTC m=+2255.759552705" watchObservedRunningTime="2026-03-07 07:37:17.296923797 +0000 UTC m=+2255.761911118" Mar 07 07:37:18 crc kubenswrapper[4738]: I0307 07:37:18.283789 4738 generic.go:334] "Generic (PLEG): container finished" podID="2594bfff-5187-46b1-8a1f-608db42850b1" 
containerID="d8032b66c77f34fa43304d23968c8c4872fafc507fdede388f6ee8713a5e983d" exitCode=0 Mar 07 07:37:18 crc kubenswrapper[4738]: I0307 07:37:18.283835 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" event={"ID":"2594bfff-5187-46b1-8a1f-608db42850b1","Type":"ContainerDied","Data":"d8032b66c77f34fa43304d23968c8c4872fafc507fdede388f6ee8713a5e983d"} Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.597376 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.624985 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g"] Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.630117 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g"] Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.650095 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2594bfff-5187-46b1-8a1f-608db42850b1-scripts\") pod \"2594bfff-5187-46b1-8a1f-608db42850b1\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.650224 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2594bfff-5187-46b1-8a1f-608db42850b1-etc-swift\") pod \"2594bfff-5187-46b1-8a1f-608db42850b1\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.650295 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2594bfff-5187-46b1-8a1f-608db42850b1-swiftconf\") pod \"2594bfff-5187-46b1-8a1f-608db42850b1\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " Mar 07 
07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.650342 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2594bfff-5187-46b1-8a1f-608db42850b1-ring-data-devices\") pod \"2594bfff-5187-46b1-8a1f-608db42850b1\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.650436 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx2p9\" (UniqueName: \"kubernetes.io/projected/2594bfff-5187-46b1-8a1f-608db42850b1-kube-api-access-lx2p9\") pod \"2594bfff-5187-46b1-8a1f-608db42850b1\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.650470 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2594bfff-5187-46b1-8a1f-608db42850b1-dispersionconf\") pod \"2594bfff-5187-46b1-8a1f-608db42850b1\" (UID: \"2594bfff-5187-46b1-8a1f-608db42850b1\") " Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.650945 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2594bfff-5187-46b1-8a1f-608db42850b1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2594bfff-5187-46b1-8a1f-608db42850b1" (UID: "2594bfff-5187-46b1-8a1f-608db42850b1"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.651051 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2594bfff-5187-46b1-8a1f-608db42850b1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2594bfff-5187-46b1-8a1f-608db42850b1" (UID: "2594bfff-5187-46b1-8a1f-608db42850b1"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.655216 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2594bfff-5187-46b1-8a1f-608db42850b1-kube-api-access-lx2p9" (OuterVolumeSpecName: "kube-api-access-lx2p9") pod "2594bfff-5187-46b1-8a1f-608db42850b1" (UID: "2594bfff-5187-46b1-8a1f-608db42850b1"). InnerVolumeSpecName "kube-api-access-lx2p9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.670363 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2594bfff-5187-46b1-8a1f-608db42850b1-scripts" (OuterVolumeSpecName: "scripts") pod "2594bfff-5187-46b1-8a1f-608db42850b1" (UID: "2594bfff-5187-46b1-8a1f-608db42850b1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.671938 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2594bfff-5187-46b1-8a1f-608db42850b1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2594bfff-5187-46b1-8a1f-608db42850b1" (UID: "2594bfff-5187-46b1-8a1f-608db42850b1"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.682999 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2594bfff-5187-46b1-8a1f-608db42850b1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2594bfff-5187-46b1-8a1f-608db42850b1" (UID: "2594bfff-5187-46b1-8a1f-608db42850b1"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.752307 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx2p9\" (UniqueName: \"kubernetes.io/projected/2594bfff-5187-46b1-8a1f-608db42850b1-kube-api-access-lx2p9\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.752552 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2594bfff-5187-46b1-8a1f-608db42850b1-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.752617 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2594bfff-5187-46b1-8a1f-608db42850b1-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.752683 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2594bfff-5187-46b1-8a1f-608db42850b1-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.752748 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2594bfff-5187-46b1-8a1f-608db42850b1-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:19 crc kubenswrapper[4738]: I0307 07:37:19.752802 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2594bfff-5187-46b1-8a1f-608db42850b1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.301931 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="88c25bde5f1f5c9d60ca2650dfa6f694176b2b30ff6a4fb59a285b033f6cb069" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.301970 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2hg2g" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.393443 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2594bfff-5187-46b1-8a1f-608db42850b1" path="/var/lib/kubelet/pods/2594bfff-5187-46b1-8a1f-608db42850b1/volumes" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.800014 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn"] Mar 07 07:37:20 crc kubenswrapper[4738]: E0307 07:37:20.800299 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2594bfff-5187-46b1-8a1f-608db42850b1" containerName="swift-ring-rebalance" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.800312 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="2594bfff-5187-46b1-8a1f-608db42850b1" containerName="swift-ring-rebalance" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.800470 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="2594bfff-5187-46b1-8a1f-608db42850b1" containerName="swift-ring-rebalance" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.800931 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.802762 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.802791 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.815673 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn"] Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.884348 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e349853c-0449-45f2-a7cf-7cd3553f2923-etc-swift\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.884423 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqfgt\" (UniqueName: \"kubernetes.io/projected/e349853c-0449-45f2-a7cf-7cd3553f2923-kube-api-access-jqfgt\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.884628 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e349853c-0449-45f2-a7cf-7cd3553f2923-ring-data-devices\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.884705 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e349853c-0449-45f2-a7cf-7cd3553f2923-scripts\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.884786 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e349853c-0449-45f2-a7cf-7cd3553f2923-swiftconf\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.884845 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e349853c-0449-45f2-a7cf-7cd3553f2923-dispersionconf\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.986324 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e349853c-0449-45f2-a7cf-7cd3553f2923-dispersionconf\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.986387 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e349853c-0449-45f2-a7cf-7cd3553f2923-etc-swift\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 
07:37:20.986424 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqfgt\" (UniqueName: \"kubernetes.io/projected/e349853c-0449-45f2-a7cf-7cd3553f2923-kube-api-access-jqfgt\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.986467 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e349853c-0449-45f2-a7cf-7cd3553f2923-ring-data-devices\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.986491 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e349853c-0449-45f2-a7cf-7cd3553f2923-scripts\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.986524 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e349853c-0449-45f2-a7cf-7cd3553f2923-swiftconf\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.987459 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e349853c-0449-45f2-a7cf-7cd3553f2923-etc-swift\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 
07:37:20.987667 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e349853c-0449-45f2-a7cf-7cd3553f2923-ring-data-devices\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.987792 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e349853c-0449-45f2-a7cf-7cd3553f2923-scripts\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.989953 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e349853c-0449-45f2-a7cf-7cd3553f2923-swiftconf\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:20 crc kubenswrapper[4738]: I0307 07:37:20.990345 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e349853c-0449-45f2-a7cf-7cd3553f2923-dispersionconf\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:21 crc kubenswrapper[4738]: I0307 07:37:21.005809 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqfgt\" (UniqueName: \"kubernetes.io/projected/e349853c-0449-45f2-a7cf-7cd3553f2923-kube-api-access-jqfgt\") pod \"swift-ring-rebalance-debug-zrkwn\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:21 crc kubenswrapper[4738]: I0307 07:37:21.119588 4738 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:21 crc kubenswrapper[4738]: I0307 07:37:21.531294 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn"] Mar 07 07:37:22 crc kubenswrapper[4738]: I0307 07:37:22.321747 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" event={"ID":"e349853c-0449-45f2-a7cf-7cd3553f2923","Type":"ContainerStarted","Data":"729d3c4fce71dd57fa14a7fb3ea42ec97e47fd51ed82a4eef3c941bddec9d9fa"} Mar 07 07:37:22 crc kubenswrapper[4738]: I0307 07:37:22.322259 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" event={"ID":"e349853c-0449-45f2-a7cf-7cd3553f2923","Type":"ContainerStarted","Data":"dfe935b4d5b6e1f3f3f7f9dd498b2de2115c10227e0a61d728b20df91ec9d12a"} Mar 07 07:37:22 crc kubenswrapper[4738]: I0307 07:37:22.351236 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" podStartSLOduration=2.351216771 podStartE2EDuration="2.351216771s" podCreationTimestamp="2026-03-07 07:37:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:37:22.350237785 +0000 UTC m=+2260.815225116" watchObservedRunningTime="2026-03-07 07:37:22.351216771 +0000 UTC m=+2260.816204092" Mar 07 07:37:23 crc kubenswrapper[4738]: I0307 07:37:23.334367 4738 generic.go:334] "Generic (PLEG): container finished" podID="e349853c-0449-45f2-a7cf-7cd3553f2923" containerID="729d3c4fce71dd57fa14a7fb3ea42ec97e47fd51ed82a4eef3c941bddec9d9fa" exitCode=0 Mar 07 07:37:23 crc kubenswrapper[4738]: I0307 07:37:23.334413 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" 
event={"ID":"e349853c-0449-45f2-a7cf-7cd3553f2923","Type":"ContainerDied","Data":"729d3c4fce71dd57fa14a7fb3ea42ec97e47fd51ed82a4eef3c941bddec9d9fa"} Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.638701 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.684095 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn"] Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.695666 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn"] Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.746259 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e349853c-0449-45f2-a7cf-7cd3553f2923-dispersionconf\") pod \"e349853c-0449-45f2-a7cf-7cd3553f2923\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.746342 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e349853c-0449-45f2-a7cf-7cd3553f2923-swiftconf\") pod \"e349853c-0449-45f2-a7cf-7cd3553f2923\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.746375 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e349853c-0449-45f2-a7cf-7cd3553f2923-ring-data-devices\") pod \"e349853c-0449-45f2-a7cf-7cd3553f2923\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.746450 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqfgt\" (UniqueName: 
\"kubernetes.io/projected/e349853c-0449-45f2-a7cf-7cd3553f2923-kube-api-access-jqfgt\") pod \"e349853c-0449-45f2-a7cf-7cd3553f2923\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.746526 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e349853c-0449-45f2-a7cf-7cd3553f2923-scripts\") pod \"e349853c-0449-45f2-a7cf-7cd3553f2923\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.746580 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e349853c-0449-45f2-a7cf-7cd3553f2923-etc-swift\") pod \"e349853c-0449-45f2-a7cf-7cd3553f2923\" (UID: \"e349853c-0449-45f2-a7cf-7cd3553f2923\") " Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.747453 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e349853c-0449-45f2-a7cf-7cd3553f2923-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e349853c-0449-45f2-a7cf-7cd3553f2923" (UID: "e349853c-0449-45f2-a7cf-7cd3553f2923"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.747479 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e349853c-0449-45f2-a7cf-7cd3553f2923-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e349853c-0449-45f2-a7cf-7cd3553f2923" (UID: "e349853c-0449-45f2-a7cf-7cd3553f2923"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.751813 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e349853c-0449-45f2-a7cf-7cd3553f2923-kube-api-access-jqfgt" (OuterVolumeSpecName: "kube-api-access-jqfgt") pod "e349853c-0449-45f2-a7cf-7cd3553f2923" (UID: "e349853c-0449-45f2-a7cf-7cd3553f2923"). InnerVolumeSpecName "kube-api-access-jqfgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.767499 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e349853c-0449-45f2-a7cf-7cd3553f2923-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e349853c-0449-45f2-a7cf-7cd3553f2923" (UID: "e349853c-0449-45f2-a7cf-7cd3553f2923"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.767615 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e349853c-0449-45f2-a7cf-7cd3553f2923-scripts" (OuterVolumeSpecName: "scripts") pod "e349853c-0449-45f2-a7cf-7cd3553f2923" (UID: "e349853c-0449-45f2-a7cf-7cd3553f2923"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.770747 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e349853c-0449-45f2-a7cf-7cd3553f2923-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e349853c-0449-45f2-a7cf-7cd3553f2923" (UID: "e349853c-0449-45f2-a7cf-7cd3553f2923"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.848148 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e349853c-0449-45f2-a7cf-7cd3553f2923-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.848197 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e349853c-0449-45f2-a7cf-7cd3553f2923-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.848208 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e349853c-0449-45f2-a7cf-7cd3553f2923-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.848219 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqfgt\" (UniqueName: \"kubernetes.io/projected/e349853c-0449-45f2-a7cf-7cd3553f2923-kube-api-access-jqfgt\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.848231 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e349853c-0449-45f2-a7cf-7cd3553f2923-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:24 crc kubenswrapper[4738]: I0307 07:37:24.848240 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e349853c-0449-45f2-a7cf-7cd3553f2923-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.354360 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfe935b4d5b6e1f3f3f7f9dd498b2de2115c10227e0a61d728b20df91ec9d12a" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.354695 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrkwn" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.820216 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw"] Mar 07 07:37:25 crc kubenswrapper[4738]: E0307 07:37:25.821025 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e349853c-0449-45f2-a7cf-7cd3553f2923" containerName="swift-ring-rebalance" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.821041 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="e349853c-0449-45f2-a7cf-7cd3553f2923" containerName="swift-ring-rebalance" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.821288 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="e349853c-0449-45f2-a7cf-7cd3553f2923" containerName="swift-ring-rebalance" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.821935 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.829117 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw"] Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.832921 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.833049 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.864896 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6b629941-9aa2-4e20-b100-87e0dd4b001e-dispersionconf\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.864961 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6b629941-9aa2-4e20-b100-87e0dd4b001e-ring-data-devices\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.865013 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lpf4\" (UniqueName: \"kubernetes.io/projected/6b629941-9aa2-4e20-b100-87e0dd4b001e-kube-api-access-6lpf4\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.865069 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6b629941-9aa2-4e20-b100-87e0dd4b001e-etc-swift\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.865101 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b629941-9aa2-4e20-b100-87e0dd4b001e-scripts\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.865228 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/6b629941-9aa2-4e20-b100-87e0dd4b001e-swiftconf\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.966869 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6b629941-9aa2-4e20-b100-87e0dd4b001e-dispersionconf\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.966922 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6b629941-9aa2-4e20-b100-87e0dd4b001e-ring-data-devices\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.966984 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lpf4\" (UniqueName: \"kubernetes.io/projected/6b629941-9aa2-4e20-b100-87e0dd4b001e-kube-api-access-6lpf4\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.967040 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6b629941-9aa2-4e20-b100-87e0dd4b001e-etc-swift\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.967063 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/6b629941-9aa2-4e20-b100-87e0dd4b001e-scripts\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.967102 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6b629941-9aa2-4e20-b100-87e0dd4b001e-swiftconf\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.967772 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6b629941-9aa2-4e20-b100-87e0dd4b001e-etc-swift\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.967885 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6b629941-9aa2-4e20-b100-87e0dd4b001e-ring-data-devices\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.968310 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b629941-9aa2-4e20-b100-87e0dd4b001e-scripts\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.971019 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/6b629941-9aa2-4e20-b100-87e0dd4b001e-dispersionconf\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.971215 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6b629941-9aa2-4e20-b100-87e0dd4b001e-swiftconf\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:25 crc kubenswrapper[4738]: I0307 07:37:25.990134 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lpf4\" (UniqueName: \"kubernetes.io/projected/6b629941-9aa2-4e20-b100-87e0dd4b001e-kube-api-access-6lpf4\") pod \"swift-ring-rebalance-debug-qzgjw\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:26 crc kubenswrapper[4738]: I0307 07:37:26.145245 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:26 crc kubenswrapper[4738]: I0307 07:37:26.394576 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e349853c-0449-45f2-a7cf-7cd3553f2923" path="/var/lib/kubelet/pods/e349853c-0449-45f2-a7cf-7cd3553f2923/volumes" Mar 07 07:37:26 crc kubenswrapper[4738]: I0307 07:37:26.485790 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw"] Mar 07 07:37:26 crc kubenswrapper[4738]: W0307 07:37:26.487985 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b629941_9aa2_4e20_b100_87e0dd4b001e.slice/crio-9b3a0c32f89d2efa7ae814a7b365d5c4468c34c0d9406522e1c106c91f2b8b39 WatchSource:0}: Error finding container 9b3a0c32f89d2efa7ae814a7b365d5c4468c34c0d9406522e1c106c91f2b8b39: Status 404 returned error can't find the container with id 9b3a0c32f89d2efa7ae814a7b365d5c4468c34c0d9406522e1c106c91f2b8b39 Mar 07 07:37:26 crc kubenswrapper[4738]: I0307 07:37:26.957951 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:37:26 crc kubenswrapper[4738]: I0307 07:37:26.959459 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:37:27 crc kubenswrapper[4738]: I0307 07:37:27.388824 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" 
event={"ID":"6b629941-9aa2-4e20-b100-87e0dd4b001e","Type":"ContainerStarted","Data":"ebdcf26fc133c3d86316c328e36a4cbede2630294e9c51935057679e1af8392c"} Mar 07 07:37:27 crc kubenswrapper[4738]: I0307 07:37:27.388874 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" event={"ID":"6b629941-9aa2-4e20-b100-87e0dd4b001e","Type":"ContainerStarted","Data":"9b3a0c32f89d2efa7ae814a7b365d5c4468c34c0d9406522e1c106c91f2b8b39"} Mar 07 07:37:27 crc kubenswrapper[4738]: I0307 07:37:27.423435 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" podStartSLOduration=2.423411303 podStartE2EDuration="2.423411303s" podCreationTimestamp="2026-03-07 07:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:37:27.422482678 +0000 UTC m=+2265.887469999" watchObservedRunningTime="2026-03-07 07:37:27.423411303 +0000 UTC m=+2265.888398664" Mar 07 07:37:28 crc kubenswrapper[4738]: I0307 07:37:28.397792 4738 generic.go:334] "Generic (PLEG): container finished" podID="6b629941-9aa2-4e20-b100-87e0dd4b001e" containerID="ebdcf26fc133c3d86316c328e36a4cbede2630294e9c51935057679e1af8392c" exitCode=0 Mar 07 07:37:28 crc kubenswrapper[4738]: I0307 07:37:28.402563 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" event={"ID":"6b629941-9aa2-4e20-b100-87e0dd4b001e","Type":"ContainerDied","Data":"ebdcf26fc133c3d86316c328e36a4cbede2630294e9c51935057679e1af8392c"} Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.708539 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.773214 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw"] Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.779919 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw"] Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.824019 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6b629941-9aa2-4e20-b100-87e0dd4b001e-swiftconf\") pod \"6b629941-9aa2-4e20-b100-87e0dd4b001e\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.824115 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lpf4\" (UniqueName: \"kubernetes.io/projected/6b629941-9aa2-4e20-b100-87e0dd4b001e-kube-api-access-6lpf4\") pod \"6b629941-9aa2-4e20-b100-87e0dd4b001e\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.824191 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6b629941-9aa2-4e20-b100-87e0dd4b001e-etc-swift\") pod \"6b629941-9aa2-4e20-b100-87e0dd4b001e\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.824282 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b629941-9aa2-4e20-b100-87e0dd4b001e-scripts\") pod \"6b629941-9aa2-4e20-b100-87e0dd4b001e\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.824361 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" 
(UniqueName: \"kubernetes.io/secret/6b629941-9aa2-4e20-b100-87e0dd4b001e-dispersionconf\") pod \"6b629941-9aa2-4e20-b100-87e0dd4b001e\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.824555 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6b629941-9aa2-4e20-b100-87e0dd4b001e-ring-data-devices\") pod \"6b629941-9aa2-4e20-b100-87e0dd4b001e\" (UID: \"6b629941-9aa2-4e20-b100-87e0dd4b001e\") " Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.825209 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b629941-9aa2-4e20-b100-87e0dd4b001e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "6b629941-9aa2-4e20-b100-87e0dd4b001e" (UID: "6b629941-9aa2-4e20-b100-87e0dd4b001e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.825836 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b629941-9aa2-4e20-b100-87e0dd4b001e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "6b629941-9aa2-4e20-b100-87e0dd4b001e" (UID: "6b629941-9aa2-4e20-b100-87e0dd4b001e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.831435 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b629941-9aa2-4e20-b100-87e0dd4b001e-kube-api-access-6lpf4" (OuterVolumeSpecName: "kube-api-access-6lpf4") pod "6b629941-9aa2-4e20-b100-87e0dd4b001e" (UID: "6b629941-9aa2-4e20-b100-87e0dd4b001e"). InnerVolumeSpecName "kube-api-access-6lpf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.846704 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b629941-9aa2-4e20-b100-87e0dd4b001e-scripts" (OuterVolumeSpecName: "scripts") pod "6b629941-9aa2-4e20-b100-87e0dd4b001e" (UID: "6b629941-9aa2-4e20-b100-87e0dd4b001e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.847853 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b629941-9aa2-4e20-b100-87e0dd4b001e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "6b629941-9aa2-4e20-b100-87e0dd4b001e" (UID: "6b629941-9aa2-4e20-b100-87e0dd4b001e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.854795 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b629941-9aa2-4e20-b100-87e0dd4b001e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "6b629941-9aa2-4e20-b100-87e0dd4b001e" (UID: "6b629941-9aa2-4e20-b100-87e0dd4b001e"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.926744 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/6b629941-9aa2-4e20-b100-87e0dd4b001e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.926778 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/6b629941-9aa2-4e20-b100-87e0dd4b001e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.926790 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lpf4\" (UniqueName: \"kubernetes.io/projected/6b629941-9aa2-4e20-b100-87e0dd4b001e-kube-api-access-6lpf4\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.926800 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/6b629941-9aa2-4e20-b100-87e0dd4b001e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.926809 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6b629941-9aa2-4e20-b100-87e0dd4b001e-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:29 crc kubenswrapper[4738]: I0307 07:37:29.926817 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/6b629941-9aa2-4e20-b100-87e0dd4b001e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.395386 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b629941-9aa2-4e20-b100-87e0dd4b001e" path="/var/lib/kubelet/pods/6b629941-9aa2-4e20-b100-87e0dd4b001e/volumes" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.415523 4738 scope.go:117] "RemoveContainer" 
containerID="ebdcf26fc133c3d86316c328e36a4cbede2630294e9c51935057679e1af8392c" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.416042 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-qzgjw" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.927275 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wshvm"] Mar 07 07:37:30 crc kubenswrapper[4738]: E0307 07:37:30.927933 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b629941-9aa2-4e20-b100-87e0dd4b001e" containerName="swift-ring-rebalance" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.927950 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b629941-9aa2-4e20-b100-87e0dd4b001e" containerName="swift-ring-rebalance" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.928332 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b629941-9aa2-4e20-b100-87e0dd4b001e" containerName="swift-ring-rebalance" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.929247 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.937034 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.937606 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.945278 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/102aad7c-52d8-453d-ac2c-e3d552b91143-swiftconf\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.945325 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/102aad7c-52d8-453d-ac2c-e3d552b91143-ring-data-devices\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.945440 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/102aad7c-52d8-453d-ac2c-e3d552b91143-dispersionconf\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.945505 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/102aad7c-52d8-453d-ac2c-e3d552b91143-scripts\") pod \"swift-ring-rebalance-debug-wshvm\" 
(UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.945558 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/102aad7c-52d8-453d-ac2c-e3d552b91143-etc-swift\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.945619 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrz2q\" (UniqueName: \"kubernetes.io/projected/102aad7c-52d8-453d-ac2c-e3d552b91143-kube-api-access-rrz2q\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:30 crc kubenswrapper[4738]: I0307 07:37:30.968642 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wshvm"] Mar 07 07:37:31 crc kubenswrapper[4738]: I0307 07:37:31.057269 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/102aad7c-52d8-453d-ac2c-e3d552b91143-swiftconf\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:31 crc kubenswrapper[4738]: I0307 07:37:31.057676 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/102aad7c-52d8-453d-ac2c-e3d552b91143-ring-data-devices\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:31 crc kubenswrapper[4738]: I0307 
07:37:31.057732 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/102aad7c-52d8-453d-ac2c-e3d552b91143-dispersionconf\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:31 crc kubenswrapper[4738]: I0307 07:37:31.057771 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/102aad7c-52d8-453d-ac2c-e3d552b91143-scripts\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:31 crc kubenswrapper[4738]: I0307 07:37:31.057821 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/102aad7c-52d8-453d-ac2c-e3d552b91143-etc-swift\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:31 crc kubenswrapper[4738]: I0307 07:37:31.057880 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrz2q\" (UniqueName: \"kubernetes.io/projected/102aad7c-52d8-453d-ac2c-e3d552b91143-kube-api-access-rrz2q\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:31 crc kubenswrapper[4738]: I0307 07:37:31.058380 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/102aad7c-52d8-453d-ac2c-e3d552b91143-etc-swift\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:31 crc kubenswrapper[4738]: I0307 
07:37:31.058804 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/102aad7c-52d8-453d-ac2c-e3d552b91143-scripts\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:31 crc kubenswrapper[4738]: I0307 07:37:31.059153 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/102aad7c-52d8-453d-ac2c-e3d552b91143-ring-data-devices\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:31 crc kubenswrapper[4738]: I0307 07:37:31.063963 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/102aad7c-52d8-453d-ac2c-e3d552b91143-dispersionconf\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:31 crc kubenswrapper[4738]: I0307 07:37:31.071700 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/102aad7c-52d8-453d-ac2c-e3d552b91143-swiftconf\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:31 crc kubenswrapper[4738]: I0307 07:37:31.075571 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrz2q\" (UniqueName: \"kubernetes.io/projected/102aad7c-52d8-453d-ac2c-e3d552b91143-kube-api-access-rrz2q\") pod \"swift-ring-rebalance-debug-wshvm\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:31 crc kubenswrapper[4738]: I0307 07:37:31.262446 4738 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:31 crc kubenswrapper[4738]: I0307 07:37:31.696945 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wshvm"] Mar 07 07:37:32 crc kubenswrapper[4738]: I0307 07:37:32.473802 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" event={"ID":"102aad7c-52d8-453d-ac2c-e3d552b91143","Type":"ContainerStarted","Data":"9ccf16f1c9927b36bda9e666d4367ef544cafd2b34b5d9b7be8c3a1fa4c90b4d"} Mar 07 07:37:32 crc kubenswrapper[4738]: I0307 07:37:32.474079 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" event={"ID":"102aad7c-52d8-453d-ac2c-e3d552b91143","Type":"ContainerStarted","Data":"6b91191ca0bc54cf94e1cc87101fed38213b4cf5962f6b305fa6e2fc888fe482"} Mar 07 07:37:32 crc kubenswrapper[4738]: I0307 07:37:32.500058 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" podStartSLOduration=2.500042883 podStartE2EDuration="2.500042883s" podCreationTimestamp="2026-03-07 07:37:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:37:32.495578134 +0000 UTC m=+2270.960565455" watchObservedRunningTime="2026-03-07 07:37:32.500042883 +0000 UTC m=+2270.965030204" Mar 07 07:37:33 crc kubenswrapper[4738]: I0307 07:37:33.482084 4738 generic.go:334] "Generic (PLEG): container finished" podID="102aad7c-52d8-453d-ac2c-e3d552b91143" containerID="9ccf16f1c9927b36bda9e666d4367ef544cafd2b34b5d9b7be8c3a1fa4c90b4d" exitCode=0 Mar 07 07:37:33 crc kubenswrapper[4738]: I0307 07:37:33.482204 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" 
event={"ID":"102aad7c-52d8-453d-ac2c-e3d552b91143","Type":"ContainerDied","Data":"9ccf16f1c9927b36bda9e666d4367ef544cafd2b34b5d9b7be8c3a1fa4c90b4d"} Mar 07 07:37:34 crc kubenswrapper[4738]: I0307 07:37:34.804543 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:34 crc kubenswrapper[4738]: I0307 07:37:34.840080 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wshvm"] Mar 07 07:37:34 crc kubenswrapper[4738]: I0307 07:37:34.847508 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wshvm"] Mar 07 07:37:34 crc kubenswrapper[4738]: I0307 07:37:34.914517 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/102aad7c-52d8-453d-ac2c-e3d552b91143-etc-swift\") pod \"102aad7c-52d8-453d-ac2c-e3d552b91143\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " Mar 07 07:37:34 crc kubenswrapper[4738]: I0307 07:37:34.914590 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/102aad7c-52d8-453d-ac2c-e3d552b91143-scripts\") pod \"102aad7c-52d8-453d-ac2c-e3d552b91143\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " Mar 07 07:37:34 crc kubenswrapper[4738]: I0307 07:37:34.914662 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/102aad7c-52d8-453d-ac2c-e3d552b91143-dispersionconf\") pod \"102aad7c-52d8-453d-ac2c-e3d552b91143\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " Mar 07 07:37:34 crc kubenswrapper[4738]: I0307 07:37:34.914684 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrz2q\" (UniqueName: 
\"kubernetes.io/projected/102aad7c-52d8-453d-ac2c-e3d552b91143-kube-api-access-rrz2q\") pod \"102aad7c-52d8-453d-ac2c-e3d552b91143\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " Mar 07 07:37:34 crc kubenswrapper[4738]: I0307 07:37:34.914740 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/102aad7c-52d8-453d-ac2c-e3d552b91143-swiftconf\") pod \"102aad7c-52d8-453d-ac2c-e3d552b91143\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " Mar 07 07:37:34 crc kubenswrapper[4738]: I0307 07:37:34.914781 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/102aad7c-52d8-453d-ac2c-e3d552b91143-ring-data-devices\") pod \"102aad7c-52d8-453d-ac2c-e3d552b91143\" (UID: \"102aad7c-52d8-453d-ac2c-e3d552b91143\") " Mar 07 07:37:34 crc kubenswrapper[4738]: I0307 07:37:34.915484 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/102aad7c-52d8-453d-ac2c-e3d552b91143-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "102aad7c-52d8-453d-ac2c-e3d552b91143" (UID: "102aad7c-52d8-453d-ac2c-e3d552b91143"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:34 crc kubenswrapper[4738]: I0307 07:37:34.917221 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/102aad7c-52d8-453d-ac2c-e3d552b91143-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "102aad7c-52d8-453d-ac2c-e3d552b91143" (UID: "102aad7c-52d8-453d-ac2c-e3d552b91143"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:37:34 crc kubenswrapper[4738]: I0307 07:37:34.921093 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/102aad7c-52d8-453d-ac2c-e3d552b91143-kube-api-access-rrz2q" (OuterVolumeSpecName: "kube-api-access-rrz2q") pod "102aad7c-52d8-453d-ac2c-e3d552b91143" (UID: "102aad7c-52d8-453d-ac2c-e3d552b91143"). InnerVolumeSpecName "kube-api-access-rrz2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:37:34 crc kubenswrapper[4738]: I0307 07:37:34.936303 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/102aad7c-52d8-453d-ac2c-e3d552b91143-scripts" (OuterVolumeSpecName: "scripts") pod "102aad7c-52d8-453d-ac2c-e3d552b91143" (UID: "102aad7c-52d8-453d-ac2c-e3d552b91143"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:34 crc kubenswrapper[4738]: I0307 07:37:34.941404 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/102aad7c-52d8-453d-ac2c-e3d552b91143-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "102aad7c-52d8-453d-ac2c-e3d552b91143" (UID: "102aad7c-52d8-453d-ac2c-e3d552b91143"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:34 crc kubenswrapper[4738]: I0307 07:37:34.946202 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/102aad7c-52d8-453d-ac2c-e3d552b91143-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "102aad7c-52d8-453d-ac2c-e3d552b91143" (UID: "102aad7c-52d8-453d-ac2c-e3d552b91143"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:35 crc kubenswrapper[4738]: I0307 07:37:35.018434 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrz2q\" (UniqueName: \"kubernetes.io/projected/102aad7c-52d8-453d-ac2c-e3d552b91143-kube-api-access-rrz2q\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:35 crc kubenswrapper[4738]: I0307 07:37:35.018502 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/102aad7c-52d8-453d-ac2c-e3d552b91143-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:35 crc kubenswrapper[4738]: I0307 07:37:35.018523 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/102aad7c-52d8-453d-ac2c-e3d552b91143-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:35 crc kubenswrapper[4738]: I0307 07:37:35.018537 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/102aad7c-52d8-453d-ac2c-e3d552b91143-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:35 crc kubenswrapper[4738]: I0307 07:37:35.018555 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/102aad7c-52d8-453d-ac2c-e3d552b91143-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:35 crc kubenswrapper[4738]: I0307 07:37:35.018567 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/102aad7c-52d8-453d-ac2c-e3d552b91143-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:35 crc kubenswrapper[4738]: I0307 07:37:35.501398 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b91191ca0bc54cf94e1cc87101fed38213b4cf5962f6b305fa6e2fc888fe482" Mar 07 07:37:35 crc kubenswrapper[4738]: I0307 07:37:35.501497 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wshvm" Mar 07 07:37:35 crc kubenswrapper[4738]: I0307 07:37:35.996333 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq"] Mar 07 07:37:35 crc kubenswrapper[4738]: E0307 07:37:35.996864 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="102aad7c-52d8-453d-ac2c-e3d552b91143" containerName="swift-ring-rebalance" Mar 07 07:37:35 crc kubenswrapper[4738]: I0307 07:37:35.996877 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="102aad7c-52d8-453d-ac2c-e3d552b91143" containerName="swift-ring-rebalance" Mar 07 07:37:35 crc kubenswrapper[4738]: I0307 07:37:35.997021 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="102aad7c-52d8-453d-ac2c-e3d552b91143" containerName="swift-ring-rebalance" Mar 07 07:37:35 crc kubenswrapper[4738]: I0307 07:37:35.997499 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:35 crc kubenswrapper[4738]: I0307 07:37:35.999744 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:37:35 crc kubenswrapper[4738]: I0307 07:37:35.999899 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.010116 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq"] Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.135709 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14a403c8-af8b-48da-8298-c97a1ca07ea6-ring-data-devices\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.135760 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14a403c8-af8b-48da-8298-c97a1ca07ea6-swiftconf\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.135791 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a403c8-af8b-48da-8298-c97a1ca07ea6-scripts\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.135898 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j96tp\" (UniqueName: \"kubernetes.io/projected/14a403c8-af8b-48da-8298-c97a1ca07ea6-kube-api-access-j96tp\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.135927 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14a403c8-af8b-48da-8298-c97a1ca07ea6-dispersionconf\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.135963 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/14a403c8-af8b-48da-8298-c97a1ca07ea6-etc-swift\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.237935 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14a403c8-af8b-48da-8298-c97a1ca07ea6-swiftconf\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.237997 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a403c8-af8b-48da-8298-c97a1ca07ea6-scripts\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.238053 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j96tp\" (UniqueName: \"kubernetes.io/projected/14a403c8-af8b-48da-8298-c97a1ca07ea6-kube-api-access-j96tp\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.238083 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14a403c8-af8b-48da-8298-c97a1ca07ea6-dispersionconf\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.238193 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/14a403c8-af8b-48da-8298-c97a1ca07ea6-etc-swift\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.238284 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14a403c8-af8b-48da-8298-c97a1ca07ea6-ring-data-devices\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.239024 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14a403c8-af8b-48da-8298-c97a1ca07ea6-etc-swift\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.239062 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a403c8-af8b-48da-8298-c97a1ca07ea6-scripts\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.239182 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14a403c8-af8b-48da-8298-c97a1ca07ea6-ring-data-devices\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.248871 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/14a403c8-af8b-48da-8298-c97a1ca07ea6-dispersionconf\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.248889 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14a403c8-af8b-48da-8298-c97a1ca07ea6-swiftconf\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.254136 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j96tp\" (UniqueName: \"kubernetes.io/projected/14a403c8-af8b-48da-8298-c97a1ca07ea6-kube-api-access-j96tp\") pod \"swift-ring-rebalance-debug-2pzcq\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.339884 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.396568 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="102aad7c-52d8-453d-ac2c-e3d552b91143" path="/var/lib/kubelet/pods/102aad7c-52d8-453d-ac2c-e3d552b91143/volumes" Mar 07 07:37:36 crc kubenswrapper[4738]: I0307 07:37:36.792738 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq"] Mar 07 07:37:36 crc kubenswrapper[4738]: W0307 07:37:36.800378 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14a403c8_af8b_48da_8298_c97a1ca07ea6.slice/crio-493c2096e27272315bbfd75d827f46e14b77513144ec88047254ae4029cc9835 WatchSource:0}: Error finding container 493c2096e27272315bbfd75d827f46e14b77513144ec88047254ae4029cc9835: Status 404 returned error can't find the container with id 493c2096e27272315bbfd75d827f46e14b77513144ec88047254ae4029cc9835 Mar 07 07:37:37 crc kubenswrapper[4738]: I0307 07:37:37.524989 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" event={"ID":"14a403c8-af8b-48da-8298-c97a1ca07ea6","Type":"ContainerStarted","Data":"e37f9234981bfde2899aa7260a53f8e680c97dbc6043b49778ce90760685d67e"} Mar 07 07:37:37 crc kubenswrapper[4738]: I0307 07:37:37.526090 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" event={"ID":"14a403c8-af8b-48da-8298-c97a1ca07ea6","Type":"ContainerStarted","Data":"493c2096e27272315bbfd75d827f46e14b77513144ec88047254ae4029cc9835"} Mar 07 07:37:37 crc kubenswrapper[4738]: I0307 07:37:37.545490 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" podStartSLOduration=2.54542876 podStartE2EDuration="2.54542876s" podCreationTimestamp="2026-03-07 
07:37:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:37:37.543384465 +0000 UTC m=+2276.008371826" watchObservedRunningTime="2026-03-07 07:37:37.54542876 +0000 UTC m=+2276.010416081" Mar 07 07:37:38 crc kubenswrapper[4738]: I0307 07:37:38.536688 4738 generic.go:334] "Generic (PLEG): container finished" podID="14a403c8-af8b-48da-8298-c97a1ca07ea6" containerID="e37f9234981bfde2899aa7260a53f8e680c97dbc6043b49778ce90760685d67e" exitCode=0 Mar 07 07:37:38 crc kubenswrapper[4738]: I0307 07:37:38.537029 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" event={"ID":"14a403c8-af8b-48da-8298-c97a1ca07ea6","Type":"ContainerDied","Data":"e37f9234981bfde2899aa7260a53f8e680c97dbc6043b49778ce90760685d67e"} Mar 07 07:37:39 crc kubenswrapper[4738]: I0307 07:37:39.851449 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:39 crc kubenswrapper[4738]: I0307 07:37:39.893366 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq"] Mar 07 07:37:39 crc kubenswrapper[4738]: I0307 07:37:39.900133 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq"] Mar 07 07:37:39 crc kubenswrapper[4738]: I0307 07:37:39.999089 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14a403c8-af8b-48da-8298-c97a1ca07ea6-etc-swift\") pod \"14a403c8-af8b-48da-8298-c97a1ca07ea6\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " Mar 07 07:37:39 crc kubenswrapper[4738]: I0307 07:37:39.999278 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/14a403c8-af8b-48da-8298-c97a1ca07ea6-scripts\") pod \"14a403c8-af8b-48da-8298-c97a1ca07ea6\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " Mar 07 07:37:39 crc kubenswrapper[4738]: I0307 07:37:39.999312 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/14a403c8-af8b-48da-8298-c97a1ca07ea6-swiftconf\") pod \"14a403c8-af8b-48da-8298-c97a1ca07ea6\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " Mar 07 07:37:39 crc kubenswrapper[4738]: I0307 07:37:39.999388 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j96tp\" (UniqueName: \"kubernetes.io/projected/14a403c8-af8b-48da-8298-c97a1ca07ea6-kube-api-access-j96tp\") pod \"14a403c8-af8b-48da-8298-c97a1ca07ea6\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " Mar 07 07:37:39 crc kubenswrapper[4738]: I0307 07:37:39.999420 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14a403c8-af8b-48da-8298-c97a1ca07ea6-dispersionconf\") pod \"14a403c8-af8b-48da-8298-c97a1ca07ea6\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " Mar 07 07:37:39 crc kubenswrapper[4738]: I0307 07:37:39.999481 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14a403c8-af8b-48da-8298-c97a1ca07ea6-ring-data-devices\") pod \"14a403c8-af8b-48da-8298-c97a1ca07ea6\" (UID: \"14a403c8-af8b-48da-8298-c97a1ca07ea6\") " Mar 07 07:37:40 crc kubenswrapper[4738]: I0307 07:37:40.000058 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14a403c8-af8b-48da-8298-c97a1ca07ea6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "14a403c8-af8b-48da-8298-c97a1ca07ea6" (UID: "14a403c8-af8b-48da-8298-c97a1ca07ea6"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:40 crc kubenswrapper[4738]: I0307 07:37:40.000266 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14a403c8-af8b-48da-8298-c97a1ca07ea6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "14a403c8-af8b-48da-8298-c97a1ca07ea6" (UID: "14a403c8-af8b-48da-8298-c97a1ca07ea6"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:37:40 crc kubenswrapper[4738]: I0307 07:37:40.012350 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14a403c8-af8b-48da-8298-c97a1ca07ea6-kube-api-access-j96tp" (OuterVolumeSpecName: "kube-api-access-j96tp") pod "14a403c8-af8b-48da-8298-c97a1ca07ea6" (UID: "14a403c8-af8b-48da-8298-c97a1ca07ea6"). InnerVolumeSpecName "kube-api-access-j96tp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:37:40 crc kubenswrapper[4738]: I0307 07:37:40.018386 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14a403c8-af8b-48da-8298-c97a1ca07ea6-scripts" (OuterVolumeSpecName: "scripts") pod "14a403c8-af8b-48da-8298-c97a1ca07ea6" (UID: "14a403c8-af8b-48da-8298-c97a1ca07ea6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:40 crc kubenswrapper[4738]: I0307 07:37:40.024669 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a403c8-af8b-48da-8298-c97a1ca07ea6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "14a403c8-af8b-48da-8298-c97a1ca07ea6" (UID: "14a403c8-af8b-48da-8298-c97a1ca07ea6"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:40 crc kubenswrapper[4738]: I0307 07:37:40.030290 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14a403c8-af8b-48da-8298-c97a1ca07ea6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "14a403c8-af8b-48da-8298-c97a1ca07ea6" (UID: "14a403c8-af8b-48da-8298-c97a1ca07ea6"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:40 crc kubenswrapper[4738]: I0307 07:37:40.101305 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j96tp\" (UniqueName: \"kubernetes.io/projected/14a403c8-af8b-48da-8298-c97a1ca07ea6-kube-api-access-j96tp\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:40 crc kubenswrapper[4738]: I0307 07:37:40.101349 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/14a403c8-af8b-48da-8298-c97a1ca07ea6-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:40 crc kubenswrapper[4738]: I0307 07:37:40.101364 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/14a403c8-af8b-48da-8298-c97a1ca07ea6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:40 crc kubenswrapper[4738]: I0307 07:37:40.101376 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/14a403c8-af8b-48da-8298-c97a1ca07ea6-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:40 crc kubenswrapper[4738]: I0307 07:37:40.101387 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/14a403c8-af8b-48da-8298-c97a1ca07ea6-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:40 crc kubenswrapper[4738]: I0307 07:37:40.101398 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/14a403c8-af8b-48da-8298-c97a1ca07ea6-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:40 crc kubenswrapper[4738]: I0307 07:37:40.401346 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14a403c8-af8b-48da-8298-c97a1ca07ea6" path="/var/lib/kubelet/pods/14a403c8-af8b-48da-8298-c97a1ca07ea6/volumes" Mar 07 07:37:40 crc kubenswrapper[4738]: I0307 07:37:40.558405 4738 scope.go:117] "RemoveContainer" containerID="e37f9234981bfde2899aa7260a53f8e680c97dbc6043b49778ce90760685d67e" Mar 07 07:37:40 crc kubenswrapper[4738]: I0307 07:37:40.558455 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-2pzcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.102380 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq"] Mar 07 07:37:41 crc kubenswrapper[4738]: E0307 07:37:41.103082 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14a403c8-af8b-48da-8298-c97a1ca07ea6" containerName="swift-ring-rebalance" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.103099 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="14a403c8-af8b-48da-8298-c97a1ca07ea6" containerName="swift-ring-rebalance" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.103319 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="14a403c8-af8b-48da-8298-c97a1ca07ea6" containerName="swift-ring-rebalance" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.103922 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.106239 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.106392 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.121940 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq"] Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.216426 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bc64f703-ee62-474a-906c-23daf34fedba-swiftconf\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.216570 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqkrw\" (UniqueName: \"kubernetes.io/projected/bc64f703-ee62-474a-906c-23daf34fedba-kube-api-access-pqkrw\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.216653 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bc64f703-ee62-474a-906c-23daf34fedba-ring-data-devices\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.216748 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bc64f703-ee62-474a-906c-23daf34fedba-etc-swift\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.216800 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bc64f703-ee62-474a-906c-23daf34fedba-dispersionconf\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.216993 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc64f703-ee62-474a-906c-23daf34fedba-scripts\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.318857 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqkrw\" (UniqueName: \"kubernetes.io/projected/bc64f703-ee62-474a-906c-23daf34fedba-kube-api-access-pqkrw\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.318980 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bc64f703-ee62-474a-906c-23daf34fedba-ring-data-devices\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 
crc kubenswrapper[4738]: I0307 07:37:41.319054 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bc64f703-ee62-474a-906c-23daf34fedba-etc-swift\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.319135 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bc64f703-ee62-474a-906c-23daf34fedba-dispersionconf\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.319220 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc64f703-ee62-474a-906c-23daf34fedba-scripts\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.319269 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bc64f703-ee62-474a-906c-23daf34fedba-swiftconf\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.320940 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bc64f703-ee62-474a-906c-23daf34fedba-etc-swift\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 
07:37:41.322131 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bc64f703-ee62-474a-906c-23daf34fedba-ring-data-devices\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.322821 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc64f703-ee62-474a-906c-23daf34fedba-scripts\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.324313 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bc64f703-ee62-474a-906c-23daf34fedba-swiftconf\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.326023 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bc64f703-ee62-474a-906c-23daf34fedba-dispersionconf\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.351067 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pfb24"] Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.352940 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pfb24" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.367132 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pfb24"] Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.370205 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqkrw\" (UniqueName: \"kubernetes.io/projected/bc64f703-ee62-474a-906c-23daf34fedba-kube-api-access-pqkrw\") pod \"swift-ring-rebalance-debug-hrvcq\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.436577 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.522672 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd2pb\" (UniqueName: \"kubernetes.io/projected/ace31251-4f07-4b03-9f69-aa69d0712538-kube-api-access-nd2pb\") pod \"certified-operators-pfb24\" (UID: \"ace31251-4f07-4b03-9f69-aa69d0712538\") " pod="openshift-marketplace/certified-operators-pfb24" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.522793 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace31251-4f07-4b03-9f69-aa69d0712538-catalog-content\") pod \"certified-operators-pfb24\" (UID: \"ace31251-4f07-4b03-9f69-aa69d0712538\") " pod="openshift-marketplace/certified-operators-pfb24" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.522837 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace31251-4f07-4b03-9f69-aa69d0712538-utilities\") pod 
\"certified-operators-pfb24\" (UID: \"ace31251-4f07-4b03-9f69-aa69d0712538\") " pod="openshift-marketplace/certified-operators-pfb24" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.624286 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd2pb\" (UniqueName: \"kubernetes.io/projected/ace31251-4f07-4b03-9f69-aa69d0712538-kube-api-access-nd2pb\") pod \"certified-operators-pfb24\" (UID: \"ace31251-4f07-4b03-9f69-aa69d0712538\") " pod="openshift-marketplace/certified-operators-pfb24" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.624650 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace31251-4f07-4b03-9f69-aa69d0712538-catalog-content\") pod \"certified-operators-pfb24\" (UID: \"ace31251-4f07-4b03-9f69-aa69d0712538\") " pod="openshift-marketplace/certified-operators-pfb24" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.624681 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace31251-4f07-4b03-9f69-aa69d0712538-utilities\") pod \"certified-operators-pfb24\" (UID: \"ace31251-4f07-4b03-9f69-aa69d0712538\") " pod="openshift-marketplace/certified-operators-pfb24" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.625141 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace31251-4f07-4b03-9f69-aa69d0712538-utilities\") pod \"certified-operators-pfb24\" (UID: \"ace31251-4f07-4b03-9f69-aa69d0712538\") " pod="openshift-marketplace/certified-operators-pfb24" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.625420 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace31251-4f07-4b03-9f69-aa69d0712538-catalog-content\") pod \"certified-operators-pfb24\" (UID: 
\"ace31251-4f07-4b03-9f69-aa69d0712538\") " pod="openshift-marketplace/certified-operators-pfb24" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.648540 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd2pb\" (UniqueName: \"kubernetes.io/projected/ace31251-4f07-4b03-9f69-aa69d0712538-kube-api-access-nd2pb\") pod \"certified-operators-pfb24\" (UID: \"ace31251-4f07-4b03-9f69-aa69d0712538\") " pod="openshift-marketplace/certified-operators-pfb24" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.738070 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pfb24" Mar 07 07:37:41 crc kubenswrapper[4738]: I0307 07:37:41.898550 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq"] Mar 07 07:37:42 crc kubenswrapper[4738]: I0307 07:37:42.234354 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pfb24"] Mar 07 07:37:42 crc kubenswrapper[4738]: I0307 07:37:42.579422 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" event={"ID":"bc64f703-ee62-474a-906c-23daf34fedba","Type":"ContainerStarted","Data":"9f950ef790d657780bf45e4e9a8892c41c1ad1ba0acc788dced915075bf9072d"} Mar 07 07:37:42 crc kubenswrapper[4738]: I0307 07:37:42.579458 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" event={"ID":"bc64f703-ee62-474a-906c-23daf34fedba","Type":"ContainerStarted","Data":"d05dc45555350540aa52fa745060f895bc5c7f740614cfd0c6e597eb5f845a49"} Mar 07 07:37:42 crc kubenswrapper[4738]: I0307 07:37:42.581428 4738 generic.go:334] "Generic (PLEG): container finished" podID="ace31251-4f07-4b03-9f69-aa69d0712538" containerID="3fce06829f8dca65734140a87dcf475086cc0694758067b0a74a49ae1d883d6f" exitCode=0 Mar 07 07:37:42 crc kubenswrapper[4738]: I0307 
07:37:42.581460 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfb24" event={"ID":"ace31251-4f07-4b03-9f69-aa69d0712538","Type":"ContainerDied","Data":"3fce06829f8dca65734140a87dcf475086cc0694758067b0a74a49ae1d883d6f"} Mar 07 07:37:42 crc kubenswrapper[4738]: I0307 07:37:42.581478 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfb24" event={"ID":"ace31251-4f07-4b03-9f69-aa69d0712538","Type":"ContainerStarted","Data":"bf7a70858545ee7accf8f003ad0fecb287c13bc233663767c6252525d79e3ed8"} Mar 07 07:37:42 crc kubenswrapper[4738]: I0307 07:37:42.598602 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" podStartSLOduration=1.5985877130000001 podStartE2EDuration="1.598587713s" podCreationTimestamp="2026-03-07 07:37:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:37:42.596598179 +0000 UTC m=+2281.061585500" watchObservedRunningTime="2026-03-07 07:37:42.598587713 +0000 UTC m=+2281.063575034" Mar 07 07:37:43 crc kubenswrapper[4738]: I0307 07:37:43.591311 4738 generic.go:334] "Generic (PLEG): container finished" podID="bc64f703-ee62-474a-906c-23daf34fedba" containerID="9f950ef790d657780bf45e4e9a8892c41c1ad1ba0acc788dced915075bf9072d" exitCode=0 Mar 07 07:37:43 crc kubenswrapper[4738]: I0307 07:37:43.591360 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" event={"ID":"bc64f703-ee62-474a-906c-23daf34fedba","Type":"ContainerDied","Data":"9f950ef790d657780bf45e4e9a8892c41c1ad1ba0acc788dced915075bf9072d"} Mar 07 07:37:44 crc kubenswrapper[4738]: I0307 07:37:44.601068 4738 generic.go:334] "Generic (PLEG): container finished" podID="ace31251-4f07-4b03-9f69-aa69d0712538" 
containerID="d4cd2abe1a32a13845d9a3bf8b359f24095f3f8295154289ae3910ae389c876e" exitCode=0 Mar 07 07:37:44 crc kubenswrapper[4738]: I0307 07:37:44.601100 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfb24" event={"ID":"ace31251-4f07-4b03-9f69-aa69d0712538","Type":"ContainerDied","Data":"d4cd2abe1a32a13845d9a3bf8b359f24095f3f8295154289ae3910ae389c876e"} Mar 07 07:37:44 crc kubenswrapper[4738]: I0307 07:37:44.911701 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq" Mar 07 07:37:44 crc kubenswrapper[4738]: I0307 07:37:44.946622 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq"] Mar 07 07:37:44 crc kubenswrapper[4738]: I0307 07:37:44.952041 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq"] Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.079341 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bc64f703-ee62-474a-906c-23daf34fedba-swiftconf\") pod \"bc64f703-ee62-474a-906c-23daf34fedba\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.079404 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bc64f703-ee62-474a-906c-23daf34fedba-dispersionconf\") pod \"bc64f703-ee62-474a-906c-23daf34fedba\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.079466 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc64f703-ee62-474a-906c-23daf34fedba-scripts\") pod \"bc64f703-ee62-474a-906c-23daf34fedba\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") " 
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.079500 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bc64f703-ee62-474a-906c-23daf34fedba-ring-data-devices\") pod \"bc64f703-ee62-474a-906c-23daf34fedba\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") "
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.079575 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqkrw\" (UniqueName: \"kubernetes.io/projected/bc64f703-ee62-474a-906c-23daf34fedba-kube-api-access-pqkrw\") pod \"bc64f703-ee62-474a-906c-23daf34fedba\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") "
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.079646 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bc64f703-ee62-474a-906c-23daf34fedba-etc-swift\") pod \"bc64f703-ee62-474a-906c-23daf34fedba\" (UID: \"bc64f703-ee62-474a-906c-23daf34fedba\") "
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.080468 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc64f703-ee62-474a-906c-23daf34fedba-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bc64f703-ee62-474a-906c-23daf34fedba" (UID: "bc64f703-ee62-474a-906c-23daf34fedba"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.080853 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc64f703-ee62-474a-906c-23daf34fedba-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "bc64f703-ee62-474a-906c-23daf34fedba" (UID: "bc64f703-ee62-474a-906c-23daf34fedba"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.091641 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc64f703-ee62-474a-906c-23daf34fedba-kube-api-access-pqkrw" (OuterVolumeSpecName: "kube-api-access-pqkrw") pod "bc64f703-ee62-474a-906c-23daf34fedba" (UID: "bc64f703-ee62-474a-906c-23daf34fedba"). InnerVolumeSpecName "kube-api-access-pqkrw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.099044 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc64f703-ee62-474a-906c-23daf34fedba-scripts" (OuterVolumeSpecName: "scripts") pod "bc64f703-ee62-474a-906c-23daf34fedba" (UID: "bc64f703-ee62-474a-906c-23daf34fedba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.108389 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc64f703-ee62-474a-906c-23daf34fedba-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "bc64f703-ee62-474a-906c-23daf34fedba" (UID: "bc64f703-ee62-474a-906c-23daf34fedba"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.108659 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc64f703-ee62-474a-906c-23daf34fedba-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "bc64f703-ee62-474a-906c-23daf34fedba" (UID: "bc64f703-ee62-474a-906c-23daf34fedba"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.181632 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqkrw\" (UniqueName: \"kubernetes.io/projected/bc64f703-ee62-474a-906c-23daf34fedba-kube-api-access-pqkrw\") on node \"crc\" DevicePath \"\""
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.181679 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bc64f703-ee62-474a-906c-23daf34fedba-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.181697 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bc64f703-ee62-474a-906c-23daf34fedba-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.181714 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bc64f703-ee62-474a-906c-23daf34fedba-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.181732 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bc64f703-ee62-474a-906c-23daf34fedba-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.181748 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bc64f703-ee62-474a-906c-23daf34fedba-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.615271 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfb24" event={"ID":"ace31251-4f07-4b03-9f69-aa69d0712538","Type":"ContainerStarted","Data":"d0e7eeea5f737fe2f6e4ee7a23837185c956d7ba49124855eee7b50381e7f48d"}
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.616884 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d05dc45555350540aa52fa745060f895bc5c7f740614cfd0c6e597eb5f845a49"
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.616955 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hrvcq"
Mar 07 07:37:45 crc kubenswrapper[4738]: I0307 07:37:45.637522 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pfb24" podStartSLOduration=2.200329436 podStartE2EDuration="4.63750448s" podCreationTimestamp="2026-03-07 07:37:41 +0000 UTC" firstStartedPulling="2026-03-07 07:37:42.583216642 +0000 UTC m=+2281.048203963" lastFinishedPulling="2026-03-07 07:37:45.020391686 +0000 UTC m=+2283.485379007" observedRunningTime="2026-03-07 07:37:45.632148738 +0000 UTC m=+2284.097136059" watchObservedRunningTime="2026-03-07 07:37:45.63750448 +0000 UTC m=+2284.102491801"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.085702 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"]
Mar 07 07:37:46 crc kubenswrapper[4738]: E0307 07:37:46.086426 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc64f703-ee62-474a-906c-23daf34fedba" containerName="swift-ring-rebalance"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.086451 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc64f703-ee62-474a-906c-23daf34fedba" containerName="swift-ring-rebalance"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.086581 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc64f703-ee62-474a-906c-23daf34fedba" containerName="swift-ring-rebalance"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.087344 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.089248 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.089272 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.097759 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"]
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.198465 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d12b6e2f-560e-4f19-8f52-b4391a67b889-ring-data-devices\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.198518 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d12b6e2f-560e-4f19-8f52-b4391a67b889-dispersionconf\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.198542 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d12b6e2f-560e-4f19-8f52-b4391a67b889-etc-swift\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.198621 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d12b6e2f-560e-4f19-8f52-b4391a67b889-scripts\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.198663 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d12b6e2f-560e-4f19-8f52-b4391a67b889-swiftconf\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.198682 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99j85\" (UniqueName: \"kubernetes.io/projected/d12b6e2f-560e-4f19-8f52-b4391a67b889-kube-api-access-99j85\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.300373 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d12b6e2f-560e-4f19-8f52-b4391a67b889-scripts\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.300428 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d12b6e2f-560e-4f19-8f52-b4391a67b889-swiftconf\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.300449 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99j85\" (UniqueName: \"kubernetes.io/projected/d12b6e2f-560e-4f19-8f52-b4391a67b889-kube-api-access-99j85\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.300484 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d12b6e2f-560e-4f19-8f52-b4391a67b889-ring-data-devices\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.300517 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d12b6e2f-560e-4f19-8f52-b4391a67b889-dispersionconf\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.300538 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d12b6e2f-560e-4f19-8f52-b4391a67b889-etc-swift\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.301012 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d12b6e2f-560e-4f19-8f52-b4391a67b889-etc-swift\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.301581 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d12b6e2f-560e-4f19-8f52-b4391a67b889-scripts\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.301641 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d12b6e2f-560e-4f19-8f52-b4391a67b889-ring-data-devices\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.308368 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d12b6e2f-560e-4f19-8f52-b4391a67b889-swiftconf\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.309089 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d12b6e2f-560e-4f19-8f52-b4391a67b889-dispersionconf\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.318456 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99j85\" (UniqueName: \"kubernetes.io/projected/d12b6e2f-560e-4f19-8f52-b4391a67b889-kube-api-access-99j85\") pod \"swift-ring-rebalance-debug-c5thj\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.394870 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc64f703-ee62-474a-906c-23daf34fedba" path="/var/lib/kubelet/pods/bc64f703-ee62-474a-906c-23daf34fedba/volumes"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.402260 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.445149 4738 scope.go:117] "RemoveContainer" containerID="492007510e067b8cb79fae26b464af1b5d45fe64ec7f02f6ea32c00ad8edafe9"
Mar 07 07:37:46 crc kubenswrapper[4738]: I0307 07:37:46.836563 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"]
Mar 07 07:37:46 crc kubenswrapper[4738]: W0307 07:37:46.845980 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd12b6e2f_560e_4f19_8f52_b4391a67b889.slice/crio-6f96064b4ca9e09b741690255df2698a21758a78e406ea81f8331a60fdaa82a8 WatchSource:0}: Error finding container 6f96064b4ca9e09b741690255df2698a21758a78e406ea81f8331a60fdaa82a8: Status 404 returned error can't find the container with id 6f96064b4ca9e09b741690255df2698a21758a78e406ea81f8331a60fdaa82a8
Mar 07 07:37:47 crc kubenswrapper[4738]: I0307 07:37:47.651467 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj" event={"ID":"d12b6e2f-560e-4f19-8f52-b4391a67b889","Type":"ContainerStarted","Data":"7deb035d31d7a43e8d79b01b720bc38b344d7b70c2684a1ddf0a5ff7ca021b42"}
Mar 07 07:37:47 crc kubenswrapper[4738]: I0307 07:37:47.652105 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj" event={"ID":"d12b6e2f-560e-4f19-8f52-b4391a67b889","Type":"ContainerStarted","Data":"6f96064b4ca9e09b741690255df2698a21758a78e406ea81f8331a60fdaa82a8"}
Mar 07 07:37:47 crc kubenswrapper[4738]: I0307 07:37:47.685755 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj" podStartSLOduration=1.6857345430000001 podStartE2EDuration="1.685734543s" podCreationTimestamp="2026-03-07 07:37:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:37:47.672991773 +0000 UTC m=+2286.137979104" watchObservedRunningTime="2026-03-07 07:37:47.685734543 +0000 UTC m=+2286.150721884"
Mar 07 07:37:48 crc kubenswrapper[4738]: I0307 07:37:48.665425 4738 generic.go:334] "Generic (PLEG): container finished" podID="d12b6e2f-560e-4f19-8f52-b4391a67b889" containerID="7deb035d31d7a43e8d79b01b720bc38b344d7b70c2684a1ddf0a5ff7ca021b42" exitCode=0
Mar 07 07:37:48 crc kubenswrapper[4738]: I0307 07:37:48.665551 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj" event={"ID":"d12b6e2f-560e-4f19-8f52-b4391a67b889","Type":"ContainerDied","Data":"7deb035d31d7a43e8d79b01b720bc38b344d7b70c2684a1ddf0a5ff7ca021b42"}
Mar 07 07:37:49 crc kubenswrapper[4738]: I0307 07:37:49.994121 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.025222 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"]
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.029722 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"]
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.159269 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d12b6e2f-560e-4f19-8f52-b4391a67b889-scripts\") pod \"d12b6e2f-560e-4f19-8f52-b4391a67b889\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") "
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.159405 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99j85\" (UniqueName: \"kubernetes.io/projected/d12b6e2f-560e-4f19-8f52-b4391a67b889-kube-api-access-99j85\") pod \"d12b6e2f-560e-4f19-8f52-b4391a67b889\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") "
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.159445 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d12b6e2f-560e-4f19-8f52-b4391a67b889-etc-swift\") pod \"d12b6e2f-560e-4f19-8f52-b4391a67b889\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") "
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.159509 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d12b6e2f-560e-4f19-8f52-b4391a67b889-ring-data-devices\") pod \"d12b6e2f-560e-4f19-8f52-b4391a67b889\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") "
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.159544 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d12b6e2f-560e-4f19-8f52-b4391a67b889-swiftconf\") pod \"d12b6e2f-560e-4f19-8f52-b4391a67b889\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") "
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.160240 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d12b6e2f-560e-4f19-8f52-b4391a67b889-dispersionconf\") pod \"d12b6e2f-560e-4f19-8f52-b4391a67b889\" (UID: \"d12b6e2f-560e-4f19-8f52-b4391a67b889\") "
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.160370 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d12b6e2f-560e-4f19-8f52-b4391a67b889-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d12b6e2f-560e-4f19-8f52-b4391a67b889" (UID: "d12b6e2f-560e-4f19-8f52-b4391a67b889"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.160568 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d12b6e2f-560e-4f19-8f52-b4391a67b889-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d12b6e2f-560e-4f19-8f52-b4391a67b889" (UID: "d12b6e2f-560e-4f19-8f52-b4391a67b889"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.160620 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d12b6e2f-560e-4f19-8f52-b4391a67b889-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.165103 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d12b6e2f-560e-4f19-8f52-b4391a67b889-kube-api-access-99j85" (OuterVolumeSpecName: "kube-api-access-99j85") pod "d12b6e2f-560e-4f19-8f52-b4391a67b889" (UID: "d12b6e2f-560e-4f19-8f52-b4391a67b889"). InnerVolumeSpecName "kube-api-access-99j85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.180387 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d12b6e2f-560e-4f19-8f52-b4391a67b889-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d12b6e2f-560e-4f19-8f52-b4391a67b889" (UID: "d12b6e2f-560e-4f19-8f52-b4391a67b889"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.181980 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d12b6e2f-560e-4f19-8f52-b4391a67b889-scripts" (OuterVolumeSpecName: "scripts") pod "d12b6e2f-560e-4f19-8f52-b4391a67b889" (UID: "d12b6e2f-560e-4f19-8f52-b4391a67b889"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.189393 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d12b6e2f-560e-4f19-8f52-b4391a67b889-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d12b6e2f-560e-4f19-8f52-b4391a67b889" (UID: "d12b6e2f-560e-4f19-8f52-b4391a67b889"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.261981 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99j85\" (UniqueName: \"kubernetes.io/projected/d12b6e2f-560e-4f19-8f52-b4391a67b889-kube-api-access-99j85\") on node \"crc\" DevicePath \"\""
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.262331 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d12b6e2f-560e-4f19-8f52-b4391a67b889-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.262349 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d12b6e2f-560e-4f19-8f52-b4391a67b889-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.262361 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d12b6e2f-560e-4f19-8f52-b4391a67b889-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.262374 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d12b6e2f-560e-4f19-8f52-b4391a67b889-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.401366 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d12b6e2f-560e-4f19-8f52-b4391a67b889" path="/var/lib/kubelet/pods/d12b6e2f-560e-4f19-8f52-b4391a67b889/volumes"
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.687188 4738 scope.go:117] "RemoveContainer" containerID="7deb035d31d7a43e8d79b01b720bc38b344d7b70c2684a1ddf0a5ff7ca021b42"
Mar 07 07:37:50 crc kubenswrapper[4738]: I0307 07:37:50.687250 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-c5thj"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.223837 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"]
Mar 07 07:37:51 crc kubenswrapper[4738]: E0307 07:37:51.224292 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d12b6e2f-560e-4f19-8f52-b4391a67b889" containerName="swift-ring-rebalance"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.224313 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12b6e2f-560e-4f19-8f52-b4391a67b889" containerName="swift-ring-rebalance"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.224560 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="d12b6e2f-560e-4f19-8f52-b4391a67b889" containerName="swift-ring-rebalance"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.225217 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.227606 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.227616 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.241153 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"]
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.381343 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-etc-swift\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.381410 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-dispersionconf\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.381617 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-swiftconf\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.381704 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qjxv\" (UniqueName: \"kubernetes.io/projected/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-kube-api-access-8qjxv\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.381854 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-scripts\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.381893 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-ring-data-devices\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.483657 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-etc-swift\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.483707 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-dispersionconf\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.483754 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-swiftconf\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.483788 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qjxv\" (UniqueName: \"kubernetes.io/projected/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-kube-api-access-8qjxv\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.484527 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-scripts\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.484558 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-ring-data-devices\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.484652 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-etc-swift\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.485311 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-ring-data-devices\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.485459 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-scripts\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.488182 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-swiftconf\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.488182 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-dispersionconf\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.505328 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qjxv\" (UniqueName: \"kubernetes.io/projected/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-kube-api-access-8qjxv\") pod \"swift-ring-rebalance-debug-xgq8z\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.548826 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.738840 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pfb24"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.739139 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pfb24"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.783501 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pfb24"
Mar 07 07:37:51 crc kubenswrapper[4738]: I0307 07:37:51.954849 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"]
Mar 07 07:37:52 crc kubenswrapper[4738]: I0307 07:37:52.728458 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z" event={"ID":"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e","Type":"ContainerStarted","Data":"2db3f6488469b75ebc303eb6dd98c372bb59c32f16936046fbc8e6066e3203f1"}
Mar 07 07:37:52 crc kubenswrapper[4738]: I0307 07:37:52.728878 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z" event={"ID":"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e","Type":"ContainerStarted","Data":"e1622ed961fd05e4e1d6fe686fa9292a560b5ed743d610523f76ba565ebf81d0"}
Mar 07 07:37:52 crc kubenswrapper[4738]: I0307 07:37:52.759490 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z" podStartSLOduration=1.7594666650000002 podStartE2EDuration="1.759466665s" podCreationTimestamp="2026-03-07 07:37:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:37:52.755661363 +0000 UTC m=+2291.220648714" watchObservedRunningTime="2026-03-07 07:37:52.759466665 +0000 UTC m=+2291.224454006"
Mar 07 07:37:52 crc kubenswrapper[4738]: I0307 07:37:52.778006 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pfb24"
Mar 07 07:37:52 crc kubenswrapper[4738]: I0307 07:37:52.824218 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pfb24"]
Mar 07 07:37:53 crc kubenswrapper[4738]: I0307 07:37:53.739979 4738 generic.go:334] "Generic (PLEG): container finished" podID="e6fa9a63-eb94-48a7-8513-6ac0b43ad49e" containerID="2db3f6488469b75ebc303eb6dd98c372bb59c32f16936046fbc8e6066e3203f1" exitCode=0
Mar 07 07:37:53 crc kubenswrapper[4738]: I0307 07:37:53.740063 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z" event={"ID":"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e","Type":"ContainerDied","Data":"2db3f6488469b75ebc303eb6dd98c372bb59c32f16936046fbc8e6066e3203f1"}
Mar 07 07:37:54 crc kubenswrapper[4738]: I0307 07:37:54.747765 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pfb24" podUID="ace31251-4f07-4b03-9f69-aa69d0712538" containerName="registry-server" containerID="cri-o://d0e7eeea5f737fe2f6e4ee7a23837185c956d7ba49124855eee7b50381e7f48d" gracePeriod=2
Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:54.999984 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"
Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.034616 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"]
Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.037546 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z"]
Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.161872 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-etc-swift\") pod \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") "
Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.161989 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-dispersionconf\") pod \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") "
Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.162058 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-scripts\") pod \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") "
Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.162093 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-swiftconf\") pod \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") "
Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.162274 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName:
\"kubernetes.io/configmap/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-ring-data-devices\") pod \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.162311 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qjxv\" (UniqueName: \"kubernetes.io/projected/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-kube-api-access-8qjxv\") pod \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\" (UID: \"e6fa9a63-eb94-48a7-8513-6ac0b43ad49e\") " Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.163002 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e6fa9a63-eb94-48a7-8513-6ac0b43ad49e" (UID: "e6fa9a63-eb94-48a7-8513-6ac0b43ad49e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.163781 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e6fa9a63-eb94-48a7-8513-6ac0b43ad49e" (UID: "e6fa9a63-eb94-48a7-8513-6ac0b43ad49e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.167560 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-kube-api-access-8qjxv" (OuterVolumeSpecName: "kube-api-access-8qjxv") pod "e6fa9a63-eb94-48a7-8513-6ac0b43ad49e" (UID: "e6fa9a63-eb94-48a7-8513-6ac0b43ad49e"). InnerVolumeSpecName "kube-api-access-8qjxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.190105 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-scripts" (OuterVolumeSpecName: "scripts") pod "e6fa9a63-eb94-48a7-8513-6ac0b43ad49e" (UID: "e6fa9a63-eb94-48a7-8513-6ac0b43ad49e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.191527 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e6fa9a63-eb94-48a7-8513-6ac0b43ad49e" (UID: "e6fa9a63-eb94-48a7-8513-6ac0b43ad49e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.191711 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e6fa9a63-eb94-48a7-8513-6ac0b43ad49e" (UID: "e6fa9a63-eb94-48a7-8513-6ac0b43ad49e"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.264323 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.264379 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qjxv\" (UniqueName: \"kubernetes.io/projected/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-kube-api-access-8qjxv\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.264392 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.264401 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.264411 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.264421 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.758704 4738 generic.go:334] "Generic (PLEG): container finished" podID="ace31251-4f07-4b03-9f69-aa69d0712538" containerID="d0e7eeea5f737fe2f6e4ee7a23837185c956d7ba49124855eee7b50381e7f48d" exitCode=0 Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.758810 4738 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-pfb24" event={"ID":"ace31251-4f07-4b03-9f69-aa69d0712538","Type":"ContainerDied","Data":"d0e7eeea5f737fe2f6e4ee7a23837185c956d7ba49124855eee7b50381e7f48d"} Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.761402 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1622ed961fd05e4e1d6fe686fa9292a560b5ed743d610523f76ba565ebf81d0" Mar 07 07:37:55 crc kubenswrapper[4738]: I0307 07:37:55.761473 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-xgq8z" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.190657 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d"] Mar 07 07:37:56 crc kubenswrapper[4738]: E0307 07:37:56.191025 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6fa9a63-eb94-48a7-8513-6ac0b43ad49e" containerName="swift-ring-rebalance" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.191044 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6fa9a63-eb94-48a7-8513-6ac0b43ad49e" containerName="swift-ring-rebalance" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.191241 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6fa9a63-eb94-48a7-8513-6ac0b43ad49e" containerName="swift-ring-rebalance" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.191841 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.194294 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.194540 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.216055 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d"] Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.281974 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-dispersionconf\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.282037 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-etc-swift\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.282068 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-scripts\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.282104 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-swiftconf\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.282182 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-ring-data-devices\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.282260 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlb2g\" (UniqueName: \"kubernetes.io/projected/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-kube-api-access-rlb2g\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.383884 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-dispersionconf\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.383943 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-etc-swift\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.383975 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-scripts\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.384008 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-swiftconf\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.384063 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-ring-data-devices\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.384106 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlb2g\" (UniqueName: \"kubernetes.io/projected/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-kube-api-access-rlb2g\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.384486 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-etc-swift\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.385074 4738 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-scripts\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.385187 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-ring-data-devices\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.393710 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-dispersionconf\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.394943 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-swiftconf\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.410915 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6fa9a63-eb94-48a7-8513-6ac0b43ad49e" path="/var/lib/kubelet/pods/e6fa9a63-eb94-48a7-8513-6ac0b43ad49e/volumes" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.415225 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlb2g\" (UniqueName: 
\"kubernetes.io/projected/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-kube-api-access-rlb2g\") pod \"swift-ring-rebalance-debug-h8j4d\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.507057 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.531793 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pfb24" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.689863 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace31251-4f07-4b03-9f69-aa69d0712538-utilities\") pod \"ace31251-4f07-4b03-9f69-aa69d0712538\" (UID: \"ace31251-4f07-4b03-9f69-aa69d0712538\") " Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.689997 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace31251-4f07-4b03-9f69-aa69d0712538-catalog-content\") pod \"ace31251-4f07-4b03-9f69-aa69d0712538\" (UID: \"ace31251-4f07-4b03-9f69-aa69d0712538\") " Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.690025 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nd2pb\" (UniqueName: \"kubernetes.io/projected/ace31251-4f07-4b03-9f69-aa69d0712538-kube-api-access-nd2pb\") pod \"ace31251-4f07-4b03-9f69-aa69d0712538\" (UID: \"ace31251-4f07-4b03-9f69-aa69d0712538\") " Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.691609 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace31251-4f07-4b03-9f69-aa69d0712538-utilities" (OuterVolumeSpecName: "utilities") pod "ace31251-4f07-4b03-9f69-aa69d0712538" (UID: 
"ace31251-4f07-4b03-9f69-aa69d0712538"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.695631 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace31251-4f07-4b03-9f69-aa69d0712538-kube-api-access-nd2pb" (OuterVolumeSpecName: "kube-api-access-nd2pb") pod "ace31251-4f07-4b03-9f69-aa69d0712538" (UID: "ace31251-4f07-4b03-9f69-aa69d0712538"). InnerVolumeSpecName "kube-api-access-nd2pb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.759012 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace31251-4f07-4b03-9f69-aa69d0712538-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ace31251-4f07-4b03-9f69-aa69d0712538" (UID: "ace31251-4f07-4b03-9f69-aa69d0712538"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.771201 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pfb24" event={"ID":"ace31251-4f07-4b03-9f69-aa69d0712538","Type":"ContainerDied","Data":"bf7a70858545ee7accf8f003ad0fecb287c13bc233663767c6252525d79e3ed8"} Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.771255 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pfb24" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.771289 4738 scope.go:117] "RemoveContainer" containerID="d0e7eeea5f737fe2f6e4ee7a23837185c956d7ba49124855eee7b50381e7f48d" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.791403 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ace31251-4f07-4b03-9f69-aa69d0712538-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.791427 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ace31251-4f07-4b03-9f69-aa69d0712538-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.791436 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nd2pb\" (UniqueName: \"kubernetes.io/projected/ace31251-4f07-4b03-9f69-aa69d0712538-kube-api-access-nd2pb\") on node \"crc\" DevicePath \"\"" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.794000 4738 scope.go:117] "RemoveContainer" containerID="d4cd2abe1a32a13845d9a3bf8b359f24095f3f8295154289ae3910ae389c876e" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.815651 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pfb24"] Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.823651 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pfb24"] Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.830540 4738 scope.go:117] "RemoveContainer" containerID="3fce06829f8dca65734140a87dcf475086cc0694758067b0a74a49ae1d883d6f" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.941022 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d"] Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 
07:37:56.957339 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.957403 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.957447 4738 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.958143 4738 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3f2f40850859badc575bb7ed1a011243a7dda83e8be3d1302a68de4787f67eff"} pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:37:56 crc kubenswrapper[4738]: I0307 07:37:56.958233 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" containerID="cri-o://3f2f40850859badc575bb7ed1a011243a7dda83e8be3d1302a68de4787f67eff" gracePeriod=600 Mar 07 07:37:57 crc kubenswrapper[4738]: I0307 07:37:57.784724 4738 generic.go:334] "Generic (PLEG): container finished" podID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerID="3f2f40850859badc575bb7ed1a011243a7dda83e8be3d1302a68de4787f67eff" exitCode=0 Mar 07 
07:37:57 crc kubenswrapper[4738]: I0307 07:37:57.784841 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerDied","Data":"3f2f40850859badc575bb7ed1a011243a7dda83e8be3d1302a68de4787f67eff"} Mar 07 07:37:57 crc kubenswrapper[4738]: I0307 07:37:57.785330 4738 scope.go:117] "RemoveContainer" containerID="46bd59578073fd7aa4f8d5fc68ca89b0199231468fb67909b75425891f069f15" Mar 07 07:37:57 crc kubenswrapper[4738]: I0307 07:37:57.788898 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" event={"ID":"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9","Type":"ContainerStarted","Data":"ec7ddc3258932f444e8277c7e4a4a42cca50fa0ba08fa4c6e6cfbaa278ef8890"} Mar 07 07:37:57 crc kubenswrapper[4738]: I0307 07:37:57.788965 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" event={"ID":"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9","Type":"ContainerStarted","Data":"76b1ac6c1c60396829a5b33380bbc87b9eadfe770541857736f5b387b64ce489"} Mar 07 07:37:58 crc kubenswrapper[4738]: I0307 07:37:58.403457 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace31251-4f07-4b03-9f69-aa69d0712538" path="/var/lib/kubelet/pods/ace31251-4f07-4b03-9f69-aa69d0712538/volumes" Mar 07 07:37:58 crc kubenswrapper[4738]: I0307 07:37:58.802039 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerStarted","Data":"549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee"} Mar 07 07:37:58 crc kubenswrapper[4738]: I0307 07:37:58.804507 4738 generic.go:334] "Generic (PLEG): container finished" podID="9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9" containerID="ec7ddc3258932f444e8277c7e4a4a42cca50fa0ba08fa4c6e6cfbaa278ef8890" 
exitCode=0 Mar 07 07:37:58 crc kubenswrapper[4738]: I0307 07:37:58.804553 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" event={"ID":"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9","Type":"ContainerDied","Data":"ec7ddc3258932f444e8277c7e4a4a42cca50fa0ba08fa4c6e6cfbaa278ef8890"} Mar 07 07:37:58 crc kubenswrapper[4738]: I0307 07:37:58.823185 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" podStartSLOduration=2.823150612 podStartE2EDuration="2.823150612s" podCreationTimestamp="2026-03-07 07:37:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:37:57.825961444 +0000 UTC m=+2296.290948805" watchObservedRunningTime="2026-03-07 07:37:58.823150612 +0000 UTC m=+2297.288137933" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.113115 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.157992 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547818-mvsbs"] Mar 07 07:38:00 crc kubenswrapper[4738]: E0307 07:38:00.158693 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9" containerName="swift-ring-rebalance" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.158726 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9" containerName="swift-ring-rebalance" Mar 07 07:38:00 crc kubenswrapper[4738]: E0307 07:38:00.158743 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace31251-4f07-4b03-9f69-aa69d0712538" containerName="extract-utilities" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.158754 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace31251-4f07-4b03-9f69-aa69d0712538" containerName="extract-utilities" Mar 07 07:38:00 crc kubenswrapper[4738]: E0307 07:38:00.158803 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace31251-4f07-4b03-9f69-aa69d0712538" containerName="registry-server" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.158833 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace31251-4f07-4b03-9f69-aa69d0712538" containerName="registry-server" Mar 07 07:38:00 crc kubenswrapper[4738]: E0307 07:38:00.158852 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace31251-4f07-4b03-9f69-aa69d0712538" containerName="extract-content" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.158860 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace31251-4f07-4b03-9f69-aa69d0712538" containerName="extract-content" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.159231 4738 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ace31251-4f07-4b03-9f69-aa69d0712538" containerName="registry-server" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.159286 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9" containerName="swift-ring-rebalance" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.160106 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547818-mvsbs" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.162912 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.162953 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.163056 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.189538 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d"] Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.196587 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547818-mvsbs"] Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.202123 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d"] Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.243079 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-ring-data-devices\") pod \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.243475 4738 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-swiftconf\") pod \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.243534 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-etc-swift\") pod \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.243594 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-scripts\") pod \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.243675 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlb2g\" (UniqueName: \"kubernetes.io/projected/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-kube-api-access-rlb2g\") pod \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.243803 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-dispersionconf\") pod \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\" (UID: \"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9\") " Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.243861 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9" (UID: 
"9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.244222 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsnph\" (UniqueName: \"kubernetes.io/projected/da05917f-8b20-4841-b237-1a0fe85e93e1-kube-api-access-tsnph\") pod \"auto-csr-approver-29547818-mvsbs\" (UID: \"da05917f-8b20-4841-b237-1a0fe85e93e1\") " pod="openshift-infra/auto-csr-approver-29547818-mvsbs" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.244461 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.244768 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9" (UID: "9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.255771 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-kube-api-access-rlb2g" (OuterVolumeSpecName: "kube-api-access-rlb2g") pod "9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9" (UID: "9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9"). InnerVolumeSpecName "kube-api-access-rlb2g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.266891 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9" (UID: "9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.271889 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9" (UID: "9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.285360 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-scripts" (OuterVolumeSpecName: "scripts") pod "9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9" (UID: "9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.346273 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsnph\" (UniqueName: \"kubernetes.io/projected/da05917f-8b20-4841-b237-1a0fe85e93e1-kube-api-access-tsnph\") pod \"auto-csr-approver-29547818-mvsbs\" (UID: \"da05917f-8b20-4841-b237-1a0fe85e93e1\") " pod="openshift-infra/auto-csr-approver-29547818-mvsbs" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.346359 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.346371 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.346381 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlb2g\" (UniqueName: \"kubernetes.io/projected/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-kube-api-access-rlb2g\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.346392 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.346401 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.363048 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsnph\" (UniqueName: 
\"kubernetes.io/projected/da05917f-8b20-4841-b237-1a0fe85e93e1-kube-api-access-tsnph\") pod \"auto-csr-approver-29547818-mvsbs\" (UID: \"da05917f-8b20-4841-b237-1a0fe85e93e1\") " pod="openshift-infra/auto-csr-approver-29547818-mvsbs" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.394572 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9" path="/var/lib/kubelet/pods/9b42be4d-6b77-4f7a-a0e5-55a93e79c0d9/volumes" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.482917 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547818-mvsbs" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.824255 4738 scope.go:117] "RemoveContainer" containerID="ec7ddc3258932f444e8277c7e4a4a42cca50fa0ba08fa4c6e6cfbaa278ef8890" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.824273 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-h8j4d" Mar 07 07:38:00 crc kubenswrapper[4738]: I0307 07:38:00.885225 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547818-mvsbs"] Mar 07 07:38:00 crc kubenswrapper[4738]: W0307 07:38:00.890940 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podda05917f_8b20_4841_b237_1a0fe85e93e1.slice/crio-d7aef752d89a0c0de1b27326470ac89c6929d9df52a2a63bd23f6384721df852 WatchSource:0}: Error finding container d7aef752d89a0c0de1b27326470ac89c6929d9df52a2a63bd23f6384721df852: Status 404 returned error can't find the container with id d7aef752d89a0c0de1b27326470ac89c6929d9df52a2a63bd23f6384721df852 Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.279835 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc"] Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 
07:38:01.280862 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.283061 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.283355 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.294797 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc"] Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.360956 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ffbf02a5-929c-48f1-b7cf-8c7923609be7-etc-swift\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.361052 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ffbf02a5-929c-48f1-b7cf-8c7923609be7-dispersionconf\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.361110 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42cqp\" (UniqueName: \"kubernetes.io/projected/ffbf02a5-929c-48f1-b7cf-8c7923609be7-kube-api-access-42cqp\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 
07:38:01.361256 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffbf02a5-929c-48f1-b7cf-8c7923609be7-scripts\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.361293 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ffbf02a5-929c-48f1-b7cf-8c7923609be7-ring-data-devices\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.361327 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ffbf02a5-929c-48f1-b7cf-8c7923609be7-swiftconf\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.462459 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffbf02a5-929c-48f1-b7cf-8c7923609be7-scripts\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.462560 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ffbf02a5-929c-48f1-b7cf-8c7923609be7-ring-data-devices\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.462595 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ffbf02a5-929c-48f1-b7cf-8c7923609be7-swiftconf\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.462719 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ffbf02a5-929c-48f1-b7cf-8c7923609be7-etc-swift\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.462803 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ffbf02a5-929c-48f1-b7cf-8c7923609be7-dispersionconf\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.462834 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42cqp\" (UniqueName: \"kubernetes.io/projected/ffbf02a5-929c-48f1-b7cf-8c7923609be7-kube-api-access-42cqp\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.463372 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ffbf02a5-929c-48f1-b7cf-8c7923609be7-etc-swift\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.463568 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ffbf02a5-929c-48f1-b7cf-8c7923609be7-ring-data-devices\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.463919 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffbf02a5-929c-48f1-b7cf-8c7923609be7-scripts\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.474096 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ffbf02a5-929c-48f1-b7cf-8c7923609be7-dispersionconf\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.474471 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ffbf02a5-929c-48f1-b7cf-8c7923609be7-swiftconf\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.481418 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42cqp\" (UniqueName: \"kubernetes.io/projected/ffbf02a5-929c-48f1-b7cf-8c7923609be7-kube-api-access-42cqp\") pod \"swift-ring-rebalance-debug-6fgjc\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.595059 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:01 crc kubenswrapper[4738]: I0307 07:38:01.840427 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547818-mvsbs" event={"ID":"da05917f-8b20-4841-b237-1a0fe85e93e1","Type":"ContainerStarted","Data":"d7aef752d89a0c0de1b27326470ac89c6929d9df52a2a63bd23f6384721df852"} Mar 07 07:38:02 crc kubenswrapper[4738]: I0307 07:38:02.084849 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc"] Mar 07 07:38:02 crc kubenswrapper[4738]: W0307 07:38:02.093652 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podffbf02a5_929c_48f1_b7cf_8c7923609be7.slice/crio-fbb2497ca21a6ec158db053892be492461f2f0ef592429fabc4328e7dcebae2f WatchSource:0}: Error finding container fbb2497ca21a6ec158db053892be492461f2f0ef592429fabc4328e7dcebae2f: Status 404 returned error can't find the container with id fbb2497ca21a6ec158db053892be492461f2f0ef592429fabc4328e7dcebae2f Mar 07 07:38:02 crc kubenswrapper[4738]: I0307 07:38:02.849883 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" event={"ID":"ffbf02a5-929c-48f1-b7cf-8c7923609be7","Type":"ContainerStarted","Data":"6d895f5d15d978676e98b5607349b675edfda5763122a066c8a6de1819d7274a"} Mar 07 07:38:02 crc kubenswrapper[4738]: I0307 07:38:02.850230 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" event={"ID":"ffbf02a5-929c-48f1-b7cf-8c7923609be7","Type":"ContainerStarted","Data":"fbb2497ca21a6ec158db053892be492461f2f0ef592429fabc4328e7dcebae2f"} Mar 07 07:38:02 crc kubenswrapper[4738]: I0307 
07:38:02.878954 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" podStartSLOduration=1.878934203 podStartE2EDuration="1.878934203s" podCreationTimestamp="2026-03-07 07:38:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:38:02.871485535 +0000 UTC m=+2301.336472856" watchObservedRunningTime="2026-03-07 07:38:02.878934203 +0000 UTC m=+2301.343921524" Mar 07 07:38:03 crc kubenswrapper[4738]: I0307 07:38:03.859189 4738 generic.go:334] "Generic (PLEG): container finished" podID="da05917f-8b20-4841-b237-1a0fe85e93e1" containerID="aa803c8fd3aac46126b5fda6b8a366f844a7664a9f0802a89e2caf80abbf2fae" exitCode=0 Mar 07 07:38:03 crc kubenswrapper[4738]: I0307 07:38:03.859242 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547818-mvsbs" event={"ID":"da05917f-8b20-4841-b237-1a0fe85e93e1","Type":"ContainerDied","Data":"aa803c8fd3aac46126b5fda6b8a366f844a7664a9f0802a89e2caf80abbf2fae"} Mar 07 07:38:04 crc kubenswrapper[4738]: I0307 07:38:04.869375 4738 generic.go:334] "Generic (PLEG): container finished" podID="ffbf02a5-929c-48f1-b7cf-8c7923609be7" containerID="6d895f5d15d978676e98b5607349b675edfda5763122a066c8a6de1819d7274a" exitCode=0 Mar 07 07:38:04 crc kubenswrapper[4738]: I0307 07:38:04.869580 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" event={"ID":"ffbf02a5-929c-48f1-b7cf-8c7923609be7","Type":"ContainerDied","Data":"6d895f5d15d978676e98b5607349b675edfda5763122a066c8a6de1819d7274a"} Mar 07 07:38:05 crc kubenswrapper[4738]: I0307 07:38:05.159784 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547818-mvsbs" Mar 07 07:38:05 crc kubenswrapper[4738]: I0307 07:38:05.324658 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsnph\" (UniqueName: \"kubernetes.io/projected/da05917f-8b20-4841-b237-1a0fe85e93e1-kube-api-access-tsnph\") pod \"da05917f-8b20-4841-b237-1a0fe85e93e1\" (UID: \"da05917f-8b20-4841-b237-1a0fe85e93e1\") " Mar 07 07:38:05 crc kubenswrapper[4738]: I0307 07:38:05.333277 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da05917f-8b20-4841-b237-1a0fe85e93e1-kube-api-access-tsnph" (OuterVolumeSpecName: "kube-api-access-tsnph") pod "da05917f-8b20-4841-b237-1a0fe85e93e1" (UID: "da05917f-8b20-4841-b237-1a0fe85e93e1"). InnerVolumeSpecName "kube-api-access-tsnph". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:38:05 crc kubenswrapper[4738]: I0307 07:38:05.426108 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsnph\" (UniqueName: \"kubernetes.io/projected/da05917f-8b20-4841-b237-1a0fe85e93e1-kube-api-access-tsnph\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:05 crc kubenswrapper[4738]: I0307 07:38:05.883360 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547818-mvsbs" Mar 07 07:38:05 crc kubenswrapper[4738]: I0307 07:38:05.883398 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547818-mvsbs" event={"ID":"da05917f-8b20-4841-b237-1a0fe85e93e1","Type":"ContainerDied","Data":"d7aef752d89a0c0de1b27326470ac89c6929d9df52a2a63bd23f6384721df852"} Mar 07 07:38:05 crc kubenswrapper[4738]: I0307 07:38:05.883795 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7aef752d89a0c0de1b27326470ac89c6929d9df52a2a63bd23f6384721df852" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.182193 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.215687 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc"] Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.218631 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc"] Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.221972 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547812-bpwrs"] Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.227690 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547812-bpwrs"] Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.340649 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ffbf02a5-929c-48f1-b7cf-8c7923609be7-ring-data-devices\") pod \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.340753 4738 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ffbf02a5-929c-48f1-b7cf-8c7923609be7-swiftconf\") pod \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.340795 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ffbf02a5-929c-48f1-b7cf-8c7923609be7-etc-swift\") pod \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.340886 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ffbf02a5-929c-48f1-b7cf-8c7923609be7-dispersionconf\") pod \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.341034 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42cqp\" (UniqueName: \"kubernetes.io/projected/ffbf02a5-929c-48f1-b7cf-8c7923609be7-kube-api-access-42cqp\") pod \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.341099 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffbf02a5-929c-48f1-b7cf-8c7923609be7-scripts\") pod \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\" (UID: \"ffbf02a5-929c-48f1-b7cf-8c7923609be7\") " Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.341657 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ffbf02a5-929c-48f1-b7cf-8c7923609be7-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ffbf02a5-929c-48f1-b7cf-8c7923609be7" (UID: "ffbf02a5-929c-48f1-b7cf-8c7923609be7"). 
InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.341980 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffbf02a5-929c-48f1-b7cf-8c7923609be7-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ffbf02a5-929c-48f1-b7cf-8c7923609be7" (UID: "ffbf02a5-929c-48f1-b7cf-8c7923609be7"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.346903 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffbf02a5-929c-48f1-b7cf-8c7923609be7-kube-api-access-42cqp" (OuterVolumeSpecName: "kube-api-access-42cqp") pod "ffbf02a5-929c-48f1-b7cf-8c7923609be7" (UID: "ffbf02a5-929c-48f1-b7cf-8c7923609be7"). InnerVolumeSpecName "kube-api-access-42cqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.361005 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffbf02a5-929c-48f1-b7cf-8c7923609be7-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ffbf02a5-929c-48f1-b7cf-8c7923609be7" (UID: "ffbf02a5-929c-48f1-b7cf-8c7923609be7"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.362690 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ffbf02a5-929c-48f1-b7cf-8c7923609be7-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ffbf02a5-929c-48f1-b7cf-8c7923609be7" (UID: "ffbf02a5-929c-48f1-b7cf-8c7923609be7"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.368962 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ffbf02a5-929c-48f1-b7cf-8c7923609be7-scripts" (OuterVolumeSpecName: "scripts") pod "ffbf02a5-929c-48f1-b7cf-8c7923609be7" (UID: "ffbf02a5-929c-48f1-b7cf-8c7923609be7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.393337 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb3d826-81d0-4364-8e2d-8f6987c5b01d" path="/var/lib/kubelet/pods/efb3d826-81d0-4364-8e2d-8f6987c5b01d/volumes" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.394414 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffbf02a5-929c-48f1-b7cf-8c7923609be7" path="/var/lib/kubelet/pods/ffbf02a5-929c-48f1-b7cf-8c7923609be7/volumes" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.443588 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42cqp\" (UniqueName: \"kubernetes.io/projected/ffbf02a5-929c-48f1-b7cf-8c7923609be7-kube-api-access-42cqp\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.443617 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ffbf02a5-929c-48f1-b7cf-8c7923609be7-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.443626 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ffbf02a5-929c-48f1-b7cf-8c7923609be7-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.443635 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/ffbf02a5-929c-48f1-b7cf-8c7923609be7-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.443643 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ffbf02a5-929c-48f1-b7cf-8c7923609be7-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.443651 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ffbf02a5-929c-48f1-b7cf-8c7923609be7-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.892687 4738 scope.go:117] "RemoveContainer" containerID="6d895f5d15d978676e98b5607349b675edfda5763122a066c8a6de1819d7274a" Mar 07 07:38:06 crc kubenswrapper[4738]: I0307 07:38:06.892778 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-6fgjc" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.342919 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8c7st"] Mar 07 07:38:07 crc kubenswrapper[4738]: E0307 07:38:07.343615 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffbf02a5-929c-48f1-b7cf-8c7923609be7" containerName="swift-ring-rebalance" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.343635 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffbf02a5-929c-48f1-b7cf-8c7923609be7" containerName="swift-ring-rebalance" Mar 07 07:38:07 crc kubenswrapper[4738]: E0307 07:38:07.343664 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da05917f-8b20-4841-b237-1a0fe85e93e1" containerName="oc" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.343671 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="da05917f-8b20-4841-b237-1a0fe85e93e1" containerName="oc" Mar 07 07:38:07 crc 
kubenswrapper[4738]: I0307 07:38:07.343821 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="da05917f-8b20-4841-b237-1a0fe85e93e1" containerName="oc" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.343837 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffbf02a5-929c-48f1-b7cf-8c7923609be7" containerName="swift-ring-rebalance" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.344357 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.348741 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.349010 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.374321 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8c7st"] Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.460166 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttzjb\" (UniqueName: \"kubernetes.io/projected/3059a132-84ee-428c-847b-125d7eebf2be-kube-api-access-ttzjb\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.460248 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3059a132-84ee-428c-847b-125d7eebf2be-scripts\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 
07:38:07.460355 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3059a132-84ee-428c-847b-125d7eebf2be-dispersionconf\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.460423 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3059a132-84ee-428c-847b-125d7eebf2be-etc-swift\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.460460 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3059a132-84ee-428c-847b-125d7eebf2be-ring-data-devices\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.460544 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3059a132-84ee-428c-847b-125d7eebf2be-swiftconf\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.562150 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3059a132-84ee-428c-847b-125d7eebf2be-etc-swift\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.562250 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3059a132-84ee-428c-847b-125d7eebf2be-ring-data-devices\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.562281 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3059a132-84ee-428c-847b-125d7eebf2be-swiftconf\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.562334 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttzjb\" (UniqueName: \"kubernetes.io/projected/3059a132-84ee-428c-847b-125d7eebf2be-kube-api-access-ttzjb\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.562361 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3059a132-84ee-428c-847b-125d7eebf2be-scripts\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.562436 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3059a132-84ee-428c-847b-125d7eebf2be-dispersionconf\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: 
\"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.562696 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3059a132-84ee-428c-847b-125d7eebf2be-etc-swift\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.563042 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3059a132-84ee-428c-847b-125d7eebf2be-ring-data-devices\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.564271 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3059a132-84ee-428c-847b-125d7eebf2be-scripts\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.571801 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3059a132-84ee-428c-847b-125d7eebf2be-swiftconf\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.572327 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3059a132-84ee-428c-847b-125d7eebf2be-dispersionconf\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.585332 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttzjb\" (UniqueName: \"kubernetes.io/projected/3059a132-84ee-428c-847b-125d7eebf2be-kube-api-access-ttzjb\") pod \"swift-ring-rebalance-debug-8c7st\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:07 crc kubenswrapper[4738]: I0307 07:38:07.713348 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:08 crc kubenswrapper[4738]: I0307 07:38:08.272374 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8c7st"] Mar 07 07:38:08 crc kubenswrapper[4738]: I0307 07:38:08.915689 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" event={"ID":"3059a132-84ee-428c-847b-125d7eebf2be","Type":"ContainerStarted","Data":"3547d7d0d99a9a179f33599dfbd628b73764e11f7e930b2b19628c889279422b"} Mar 07 07:38:08 crc kubenswrapper[4738]: I0307 07:38:08.915760 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" event={"ID":"3059a132-84ee-428c-847b-125d7eebf2be","Type":"ContainerStarted","Data":"0e2f873ba9a19fe28ca6337819c125331f5823799e0ffb2cf410cb6c45cdfada"} Mar 07 07:38:09 crc kubenswrapper[4738]: I0307 07:38:09.928635 4738 generic.go:334] "Generic (PLEG): container finished" podID="3059a132-84ee-428c-847b-125d7eebf2be" containerID="3547d7d0d99a9a179f33599dfbd628b73764e11f7e930b2b19628c889279422b" exitCode=0 Mar 07 07:38:09 crc kubenswrapper[4738]: I0307 07:38:09.928718 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" 
event={"ID":"3059a132-84ee-428c-847b-125d7eebf2be","Type":"ContainerDied","Data":"3547d7d0d99a9a179f33599dfbd628b73764e11f7e930b2b19628c889279422b"} Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.219899 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.263718 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8c7st"] Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.268840 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8c7st"] Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.317281 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3059a132-84ee-428c-847b-125d7eebf2be-etc-swift\") pod \"3059a132-84ee-428c-847b-125d7eebf2be\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.317381 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3059a132-84ee-428c-847b-125d7eebf2be-ring-data-devices\") pod \"3059a132-84ee-428c-847b-125d7eebf2be\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.317432 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3059a132-84ee-428c-847b-125d7eebf2be-swiftconf\") pod \"3059a132-84ee-428c-847b-125d7eebf2be\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.317466 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/3059a132-84ee-428c-847b-125d7eebf2be-dispersionconf\") pod \"3059a132-84ee-428c-847b-125d7eebf2be\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.317492 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ttzjb\" (UniqueName: \"kubernetes.io/projected/3059a132-84ee-428c-847b-125d7eebf2be-kube-api-access-ttzjb\") pod \"3059a132-84ee-428c-847b-125d7eebf2be\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.317576 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3059a132-84ee-428c-847b-125d7eebf2be-scripts\") pod \"3059a132-84ee-428c-847b-125d7eebf2be\" (UID: \"3059a132-84ee-428c-847b-125d7eebf2be\") " Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.319488 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3059a132-84ee-428c-847b-125d7eebf2be-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3059a132-84ee-428c-847b-125d7eebf2be" (UID: "3059a132-84ee-428c-847b-125d7eebf2be"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.319557 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3059a132-84ee-428c-847b-125d7eebf2be-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3059a132-84ee-428c-847b-125d7eebf2be" (UID: "3059a132-84ee-428c-847b-125d7eebf2be"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.337423 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3059a132-84ee-428c-847b-125d7eebf2be-kube-api-access-ttzjb" (OuterVolumeSpecName: "kube-api-access-ttzjb") pod "3059a132-84ee-428c-847b-125d7eebf2be" (UID: "3059a132-84ee-428c-847b-125d7eebf2be"). InnerVolumeSpecName "kube-api-access-ttzjb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.340122 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3059a132-84ee-428c-847b-125d7eebf2be-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3059a132-84ee-428c-847b-125d7eebf2be" (UID: "3059a132-84ee-428c-847b-125d7eebf2be"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.342700 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3059a132-84ee-428c-847b-125d7eebf2be-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3059a132-84ee-428c-847b-125d7eebf2be" (UID: "3059a132-84ee-428c-847b-125d7eebf2be"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.356786 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3059a132-84ee-428c-847b-125d7eebf2be-scripts" (OuterVolumeSpecName: "scripts") pod "3059a132-84ee-428c-847b-125d7eebf2be" (UID: "3059a132-84ee-428c-847b-125d7eebf2be"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.418834 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3059a132-84ee-428c-847b-125d7eebf2be-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.418870 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3059a132-84ee-428c-847b-125d7eebf2be-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.418882 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3059a132-84ee-428c-847b-125d7eebf2be-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.418894 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3059a132-84ee-428c-847b-125d7eebf2be-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.418905 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3059a132-84ee-428c-847b-125d7eebf2be-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.418914 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ttzjb\" (UniqueName: \"kubernetes.io/projected/3059a132-84ee-428c-847b-125d7eebf2be-kube-api-access-ttzjb\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.955655 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e2f873ba9a19fe28ca6337819c125331f5823799e0ffb2cf410cb6c45cdfada" Mar 07 07:38:11 crc kubenswrapper[4738]: I0307 07:38:11.955752 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8c7st" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.382514 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm"] Mar 07 07:38:12 crc kubenswrapper[4738]: E0307 07:38:12.382824 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3059a132-84ee-428c-847b-125d7eebf2be" containerName="swift-ring-rebalance" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.382839 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="3059a132-84ee-428c-847b-125d7eebf2be" containerName="swift-ring-rebalance" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.382984 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="3059a132-84ee-428c-847b-125d7eebf2be" containerName="swift-ring-rebalance" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.383462 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.392726 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.395962 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.398837 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3059a132-84ee-428c-847b-125d7eebf2be" path="/var/lib/kubelet/pods/3059a132-84ee-428c-847b-125d7eebf2be/volumes" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.399303 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm"] Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.536481 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/557f56da-d636-4aa9-b794-f2e447f40ac4-ring-data-devices\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.536640 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzr2d\" (UniqueName: \"kubernetes.io/projected/557f56da-d636-4aa9-b794-f2e447f40ac4-kube-api-access-xzr2d\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.536753 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/557f56da-d636-4aa9-b794-f2e447f40ac4-scripts\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.536979 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/557f56da-d636-4aa9-b794-f2e447f40ac4-dispersionconf\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.537030 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/557f56da-d636-4aa9-b794-f2e447f40ac4-swiftconf\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: 
I0307 07:38:12.537087 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/557f56da-d636-4aa9-b794-f2e447f40ac4-etc-swift\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.638916 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/557f56da-d636-4aa9-b794-f2e447f40ac4-scripts\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.639014 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/557f56da-d636-4aa9-b794-f2e447f40ac4-dispersionconf\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.639040 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/557f56da-d636-4aa9-b794-f2e447f40ac4-swiftconf\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.639068 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/557f56da-d636-4aa9-b794-f2e447f40ac4-etc-swift\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 
07:38:12.639085 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/557f56da-d636-4aa9-b794-f2e447f40ac4-ring-data-devices\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.639117 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzr2d\" (UniqueName: \"kubernetes.io/projected/557f56da-d636-4aa9-b794-f2e447f40ac4-kube-api-access-xzr2d\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.640601 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/557f56da-d636-4aa9-b794-f2e447f40ac4-scripts\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.641015 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/557f56da-d636-4aa9-b794-f2e447f40ac4-etc-swift\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.641597 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/557f56da-d636-4aa9-b794-f2e447f40ac4-ring-data-devices\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: 
I0307 07:38:12.645689 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/557f56da-d636-4aa9-b794-f2e447f40ac4-swiftconf\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.647357 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/557f56da-d636-4aa9-b794-f2e447f40ac4-dispersionconf\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.681261 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzr2d\" (UniqueName: \"kubernetes.io/projected/557f56da-d636-4aa9-b794-f2e447f40ac4-kube-api-access-xzr2d\") pod \"swift-ring-rebalance-debug-rf6rm\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:12 crc kubenswrapper[4738]: I0307 07:38:12.700856 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:13 crc kubenswrapper[4738]: I0307 07:38:13.239738 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm"] Mar 07 07:38:13 crc kubenswrapper[4738]: I0307 07:38:13.978620 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" event={"ID":"557f56da-d636-4aa9-b794-f2e447f40ac4","Type":"ContainerStarted","Data":"dcdb40a89375621d4db4b739c0a7380e9d5a243e5e070ddda976d1b48827fe56"} Mar 07 07:38:14 crc kubenswrapper[4738]: I0307 07:38:14.989433 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" event={"ID":"557f56da-d636-4aa9-b794-f2e447f40ac4","Type":"ContainerStarted","Data":"bbad7c893fe39e697cfa77a86f58e03b48f14de718b06021a61cc13359cee452"} Mar 07 07:38:15 crc kubenswrapper[4738]: I0307 07:38:15.010735 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" podStartSLOduration=3.010717038 podStartE2EDuration="3.010717038s" podCreationTimestamp="2026-03-07 07:38:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:38:15.004813611 +0000 UTC m=+2313.469800942" watchObservedRunningTime="2026-03-07 07:38:15.010717038 +0000 UTC m=+2313.475704369" Mar 07 07:38:16 crc kubenswrapper[4738]: I0307 07:38:15.999817 4738 generic.go:334] "Generic (PLEG): container finished" podID="557f56da-d636-4aa9-b794-f2e447f40ac4" containerID="bbad7c893fe39e697cfa77a86f58e03b48f14de718b06021a61cc13359cee452" exitCode=0 Mar 07 07:38:16 crc kubenswrapper[4738]: I0307 07:38:16.000193 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" 
event={"ID":"557f56da-d636-4aa9-b794-f2e447f40ac4","Type":"ContainerDied","Data":"bbad7c893fe39e697cfa77a86f58e03b48f14de718b06021a61cc13359cee452"} Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.275925 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.313850 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm"] Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.318313 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm"] Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.410112 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/557f56da-d636-4aa9-b794-f2e447f40ac4-dispersionconf\") pod \"557f56da-d636-4aa9-b794-f2e447f40ac4\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.410236 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzr2d\" (UniqueName: \"kubernetes.io/projected/557f56da-d636-4aa9-b794-f2e447f40ac4-kube-api-access-xzr2d\") pod \"557f56da-d636-4aa9-b794-f2e447f40ac4\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.410358 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/557f56da-d636-4aa9-b794-f2e447f40ac4-ring-data-devices\") pod \"557f56da-d636-4aa9-b794-f2e447f40ac4\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.410400 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/557f56da-d636-4aa9-b794-f2e447f40ac4-swiftconf\") pod \"557f56da-d636-4aa9-b794-f2e447f40ac4\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.410443 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/557f56da-d636-4aa9-b794-f2e447f40ac4-scripts\") pod \"557f56da-d636-4aa9-b794-f2e447f40ac4\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.410496 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/557f56da-d636-4aa9-b794-f2e447f40ac4-etc-swift\") pod \"557f56da-d636-4aa9-b794-f2e447f40ac4\" (UID: \"557f56da-d636-4aa9-b794-f2e447f40ac4\") " Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.412312 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/557f56da-d636-4aa9-b794-f2e447f40ac4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "557f56da-d636-4aa9-b794-f2e447f40ac4" (UID: "557f56da-d636-4aa9-b794-f2e447f40ac4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.414043 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/557f56da-d636-4aa9-b794-f2e447f40ac4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "557f56da-d636-4aa9-b794-f2e447f40ac4" (UID: "557f56da-d636-4aa9-b794-f2e447f40ac4"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.426621 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/557f56da-d636-4aa9-b794-f2e447f40ac4-kube-api-access-xzr2d" (OuterVolumeSpecName: "kube-api-access-xzr2d") pod "557f56da-d636-4aa9-b794-f2e447f40ac4" (UID: "557f56da-d636-4aa9-b794-f2e447f40ac4"). InnerVolumeSpecName "kube-api-access-xzr2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.437811 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/557f56da-d636-4aa9-b794-f2e447f40ac4-scripts" (OuterVolumeSpecName: "scripts") pod "557f56da-d636-4aa9-b794-f2e447f40ac4" (UID: "557f56da-d636-4aa9-b794-f2e447f40ac4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.443232 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/557f56da-d636-4aa9-b794-f2e447f40ac4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "557f56da-d636-4aa9-b794-f2e447f40ac4" (UID: "557f56da-d636-4aa9-b794-f2e447f40ac4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.461516 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/557f56da-d636-4aa9-b794-f2e447f40ac4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "557f56da-d636-4aa9-b794-f2e447f40ac4" (UID: "557f56da-d636-4aa9-b794-f2e447f40ac4"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.512529 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/557f56da-d636-4aa9-b794-f2e447f40ac4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.512576 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/557f56da-d636-4aa9-b794-f2e447f40ac4-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.512589 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/557f56da-d636-4aa9-b794-f2e447f40ac4-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.512602 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/557f56da-d636-4aa9-b794-f2e447f40ac4-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.512615 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/557f56da-d636-4aa9-b794-f2e447f40ac4-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:17 crc kubenswrapper[4738]: I0307 07:38:17.512627 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzr2d\" (UniqueName: \"kubernetes.io/projected/557f56da-d636-4aa9-b794-f2e447f40ac4-kube-api-access-xzr2d\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.038637 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dcdb40a89375621d4db4b739c0a7380e9d5a243e5e070ddda976d1b48827fe56" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.038694 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rf6rm" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.400974 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="557f56da-d636-4aa9-b794-f2e447f40ac4" path="/var/lib/kubelet/pods/557f56da-d636-4aa9-b794-f2e447f40ac4/volumes" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.446807 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp"] Mar 07 07:38:18 crc kubenswrapper[4738]: E0307 07:38:18.447136 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="557f56da-d636-4aa9-b794-f2e447f40ac4" containerName="swift-ring-rebalance" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.447181 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="557f56da-d636-4aa9-b794-f2e447f40ac4" containerName="swift-ring-rebalance" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.447392 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="557f56da-d636-4aa9-b794-f2e447f40ac4" containerName="swift-ring-rebalance" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.447988 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.451650 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.451995 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.471220 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp"] Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.528422 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zwj7\" (UniqueName: \"kubernetes.io/projected/95304b3d-533d-418f-9542-1f06ddd7b914-kube-api-access-6zwj7\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.528475 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95304b3d-533d-418f-9542-1f06ddd7b914-scripts\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.528519 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/95304b3d-533d-418f-9542-1f06ddd7b914-ring-data-devices\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.528582 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/95304b3d-533d-418f-9542-1f06ddd7b914-dispersionconf\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.528663 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/95304b3d-533d-418f-9542-1f06ddd7b914-swiftconf\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.528702 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/95304b3d-533d-418f-9542-1f06ddd7b914-etc-swift\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.629953 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zwj7\" (UniqueName: \"kubernetes.io/projected/95304b3d-533d-418f-9542-1f06ddd7b914-kube-api-access-6zwj7\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.630010 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95304b3d-533d-418f-9542-1f06ddd7b914-scripts\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc 
kubenswrapper[4738]: I0307 07:38:18.630106 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/95304b3d-533d-418f-9542-1f06ddd7b914-ring-data-devices\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.630195 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/95304b3d-533d-418f-9542-1f06ddd7b914-dispersionconf\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.630267 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/95304b3d-533d-418f-9542-1f06ddd7b914-swiftconf\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.630307 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/95304b3d-533d-418f-9542-1f06ddd7b914-etc-swift\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.630740 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/95304b3d-533d-418f-9542-1f06ddd7b914-etc-swift\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc 
kubenswrapper[4738]: I0307 07:38:18.630938 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95304b3d-533d-418f-9542-1f06ddd7b914-scripts\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.630972 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/95304b3d-533d-418f-9542-1f06ddd7b914-ring-data-devices\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.633877 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/95304b3d-533d-418f-9542-1f06ddd7b914-swiftconf\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.634103 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/95304b3d-533d-418f-9542-1f06ddd7b914-dispersionconf\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: I0307 07:38:18.650374 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zwj7\" (UniqueName: \"kubernetes.io/projected/95304b3d-533d-418f-9542-1f06ddd7b914-kube-api-access-6zwj7\") pod \"swift-ring-rebalance-debug-rh2dp\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:18 crc kubenswrapper[4738]: 
I0307 07:38:18.786238 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:23 crc kubenswrapper[4738]: I0307 07:38:23.093072 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp"] Mar 07 07:38:24 crc kubenswrapper[4738]: I0307 07:38:24.100611 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" event={"ID":"95304b3d-533d-418f-9542-1f06ddd7b914","Type":"ContainerStarted","Data":"a5b46dbd76fc1e2ea069230f1601d39d5ac3ba38684f03a7107a5373d431d084"} Mar 07 07:38:24 crc kubenswrapper[4738]: I0307 07:38:24.101433 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" event={"ID":"95304b3d-533d-418f-9542-1f06ddd7b914","Type":"ContainerStarted","Data":"9492d3e2aabc4af814216a5213b2bf5496342dc899d30e0136147b5aaf5e68bd"} Mar 07 07:38:24 crc kubenswrapper[4738]: I0307 07:38:24.124460 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" podStartSLOduration=6.124442508 podStartE2EDuration="6.124442508s" podCreationTimestamp="2026-03-07 07:38:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:38:24.123170505 +0000 UTC m=+2322.588157826" watchObservedRunningTime="2026-03-07 07:38:24.124442508 +0000 UTC m=+2322.589429829" Mar 07 07:38:25 crc kubenswrapper[4738]: I0307 07:38:25.112917 4738 generic.go:334] "Generic (PLEG): container finished" podID="95304b3d-533d-418f-9542-1f06ddd7b914" containerID="a5b46dbd76fc1e2ea069230f1601d39d5ac3ba38684f03a7107a5373d431d084" exitCode=0 Mar 07 07:38:25 crc kubenswrapper[4738]: I0307 07:38:25.112967 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" 
event={"ID":"95304b3d-533d-418f-9542-1f06ddd7b914","Type":"ContainerDied","Data":"a5b46dbd76fc1e2ea069230f1601d39d5ac3ba38684f03a7107a5373d431d084"} Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.397423 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.438284 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp"] Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.448019 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp"] Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.578523 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/95304b3d-533d-418f-9542-1f06ddd7b914-swiftconf\") pod \"95304b3d-533d-418f-9542-1f06ddd7b914\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.578589 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/95304b3d-533d-418f-9542-1f06ddd7b914-ring-data-devices\") pod \"95304b3d-533d-418f-9542-1f06ddd7b914\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.578674 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/95304b3d-533d-418f-9542-1f06ddd7b914-etc-swift\") pod \"95304b3d-533d-418f-9542-1f06ddd7b914\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.578739 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zwj7\" (UniqueName: 
\"kubernetes.io/projected/95304b3d-533d-418f-9542-1f06ddd7b914-kube-api-access-6zwj7\") pod \"95304b3d-533d-418f-9542-1f06ddd7b914\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.578796 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/95304b3d-533d-418f-9542-1f06ddd7b914-dispersionconf\") pod \"95304b3d-533d-418f-9542-1f06ddd7b914\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.578862 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95304b3d-533d-418f-9542-1f06ddd7b914-scripts\") pod \"95304b3d-533d-418f-9542-1f06ddd7b914\" (UID: \"95304b3d-533d-418f-9542-1f06ddd7b914\") " Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.579613 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95304b3d-533d-418f-9542-1f06ddd7b914-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "95304b3d-533d-418f-9542-1f06ddd7b914" (UID: "95304b3d-533d-418f-9542-1f06ddd7b914"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.579618 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95304b3d-533d-418f-9542-1f06ddd7b914-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "95304b3d-533d-418f-9542-1f06ddd7b914" (UID: "95304b3d-533d-418f-9542-1f06ddd7b914"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.583513 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95304b3d-533d-418f-9542-1f06ddd7b914-kube-api-access-6zwj7" (OuterVolumeSpecName: "kube-api-access-6zwj7") pod "95304b3d-533d-418f-9542-1f06ddd7b914" (UID: "95304b3d-533d-418f-9542-1f06ddd7b914"). InnerVolumeSpecName "kube-api-access-6zwj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.596884 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95304b3d-533d-418f-9542-1f06ddd7b914-scripts" (OuterVolumeSpecName: "scripts") pod "95304b3d-533d-418f-9542-1f06ddd7b914" (UID: "95304b3d-533d-418f-9542-1f06ddd7b914"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.598344 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95304b3d-533d-418f-9542-1f06ddd7b914-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "95304b3d-533d-418f-9542-1f06ddd7b914" (UID: "95304b3d-533d-418f-9542-1f06ddd7b914"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.599696 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95304b3d-533d-418f-9542-1f06ddd7b914-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "95304b3d-533d-418f-9542-1f06ddd7b914" (UID: "95304b3d-533d-418f-9542-1f06ddd7b914"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.680552 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95304b3d-533d-418f-9542-1f06ddd7b914-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.680840 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/95304b3d-533d-418f-9542-1f06ddd7b914-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.680855 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/95304b3d-533d-418f-9542-1f06ddd7b914-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.680868 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/95304b3d-533d-418f-9542-1f06ddd7b914-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.680883 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zwj7\" (UniqueName: \"kubernetes.io/projected/95304b3d-533d-418f-9542-1f06ddd7b914-kube-api-access-6zwj7\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:26 crc kubenswrapper[4738]: I0307 07:38:26.680896 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/95304b3d-533d-418f-9542-1f06ddd7b914-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.134305 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9492d3e2aabc4af814216a5213b2bf5496342dc899d30e0136147b5aaf5e68bd" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.134387 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rh2dp" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.654708 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n"] Mar 07 07:38:27 crc kubenswrapper[4738]: E0307 07:38:27.654990 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95304b3d-533d-418f-9542-1f06ddd7b914" containerName="swift-ring-rebalance" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.655003 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="95304b3d-533d-418f-9542-1f06ddd7b914" containerName="swift-ring-rebalance" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.655214 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="95304b3d-533d-418f-9542-1f06ddd7b914" containerName="swift-ring-rebalance" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.655668 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.658228 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.659472 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.670151 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n"] Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.796731 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-ring-data-devices\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.796834 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-scripts\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.796877 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-dispersionconf\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.796935 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tt65v\" (UniqueName: \"kubernetes.io/projected/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-kube-api-access-tt65v\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.796983 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-swiftconf\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.797017 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-etc-swift\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.898299 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tt65v\" (UniqueName: \"kubernetes.io/projected/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-kube-api-access-tt65v\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.898358 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-swiftconf\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.898402 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-etc-swift\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.898469 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-ring-data-devices\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.898516 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-scripts\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.898560 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-dispersionconf\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.899685 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-scripts\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.899775 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-ring-data-devices\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.900247 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-etc-swift\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.902730 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-swiftconf\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.902767 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-dispersionconf\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.926286 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tt65v\" (UniqueName: \"kubernetes.io/projected/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-kube-api-access-tt65v\") pod \"swift-ring-rebalance-debug-4hp2n\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:27 crc kubenswrapper[4738]: I0307 07:38:27.972377 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" Mar 07 07:38:28 crc kubenswrapper[4738]: I0307 07:38:28.400297 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95304b3d-533d-418f-9542-1f06ddd7b914" path="/var/lib/kubelet/pods/95304b3d-533d-418f-9542-1f06ddd7b914/volumes" Mar 07 07:38:28 crc kubenswrapper[4738]: I0307 07:38:28.401399 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n"] Mar 07 07:38:29 crc kubenswrapper[4738]: I0307 07:38:29.164003 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" event={"ID":"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a","Type":"ContainerStarted","Data":"39b267373222952ff87d6b5f4f9b66f561747b5f437b04f412706bcb4a59cf4d"} Mar 07 07:38:29 crc kubenswrapper[4738]: I0307 07:38:29.164367 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" event={"ID":"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a","Type":"ContainerStarted","Data":"54fa3ab15aaebcdfa6220d6695d92eb81a41a4a18d21c5f7e87a2ab7eabb5990"} Mar 07 07:38:29 crc kubenswrapper[4738]: I0307 07:38:29.188527 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" podStartSLOduration=2.188501184 podStartE2EDuration="2.188501184s" podCreationTimestamp="2026-03-07 07:38:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:38:29.181451265 +0000 UTC m=+2327.646438636" watchObservedRunningTime="2026-03-07 07:38:29.188501184 +0000 UTC m=+2327.653488535" Mar 07 07:38:30 crc kubenswrapper[4738]: I0307 07:38:30.179103 4738 generic.go:334] "Generic (PLEG): container finished" podID="fa581ac4-eb3a-4fd0-bf9c-8676ec52248a" containerID="39b267373222952ff87d6b5f4f9b66f561747b5f437b04f412706bcb4a59cf4d" exitCode=0 
Mar 07 07:38:30 crc kubenswrapper[4738]: I0307 07:38:30.179146 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n" event={"ID":"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a","Type":"ContainerDied","Data":"39b267373222952ff87d6b5f4f9b66f561747b5f437b04f412706bcb4a59cf4d"}
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.521200 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n"
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.552600 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n"]
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.557921 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n"]
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.654102 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tt65v\" (UniqueName: \"kubernetes.io/projected/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-kube-api-access-tt65v\") pod \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") "
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.654196 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-scripts\") pod \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") "
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.654245 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-ring-data-devices\") pod \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") "
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.654330 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-etc-swift\") pod \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") "
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.654396 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-swiftconf\") pod \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") "
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.654427 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-dispersionconf\") pod \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\" (UID: \"fa581ac4-eb3a-4fd0-bf9c-8676ec52248a\") "
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.654996 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "fa581ac4-eb3a-4fd0-bf9c-8676ec52248a" (UID: "fa581ac4-eb3a-4fd0-bf9c-8676ec52248a"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.655673 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "fa581ac4-eb3a-4fd0-bf9c-8676ec52248a" (UID: "fa581ac4-eb3a-4fd0-bf9c-8676ec52248a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.659762 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-kube-api-access-tt65v" (OuterVolumeSpecName: "kube-api-access-tt65v") pod "fa581ac4-eb3a-4fd0-bf9c-8676ec52248a" (UID: "fa581ac4-eb3a-4fd0-bf9c-8676ec52248a"). InnerVolumeSpecName "kube-api-access-tt65v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.674752 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-scripts" (OuterVolumeSpecName: "scripts") pod "fa581ac4-eb3a-4fd0-bf9c-8676ec52248a" (UID: "fa581ac4-eb3a-4fd0-bf9c-8676ec52248a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.677095 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "fa581ac4-eb3a-4fd0-bf9c-8676ec52248a" (UID: "fa581ac4-eb3a-4fd0-bf9c-8676ec52248a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.684472 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "fa581ac4-eb3a-4fd0-bf9c-8676ec52248a" (UID: "fa581ac4-eb3a-4fd0-bf9c-8676ec52248a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.756335 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.756391 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.756402 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.756415 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tt65v\" (UniqueName: \"kubernetes.io/projected/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-kube-api-access-tt65v\") on node \"crc\" DevicePath \"\""
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.756425 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 07:38:31 crc kubenswrapper[4738]: I0307 07:38:31.756434 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.199452 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54fa3ab15aaebcdfa6220d6695d92eb81a41a4a18d21c5f7e87a2ab7eabb5990"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.199559 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4hp2n"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.397063 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa581ac4-eb3a-4fd0-bf9c-8676ec52248a" path="/var/lib/kubelet/pods/fa581ac4-eb3a-4fd0-bf9c-8676ec52248a/volumes"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.726345 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"]
Mar 07 07:38:32 crc kubenswrapper[4738]: E0307 07:38:32.726898 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa581ac4-eb3a-4fd0-bf9c-8676ec52248a" containerName="swift-ring-rebalance"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.726910 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa581ac4-eb3a-4fd0-bf9c-8676ec52248a" containerName="swift-ring-rebalance"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.727056 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa581ac4-eb3a-4fd0-bf9c-8676ec52248a" containerName="swift-ring-rebalance"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.727559 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.729276 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.729455 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.744213 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"]
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.875595 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bfe1813-24de-4b7f-bfdd-028acad03a01-scripts\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.875649 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9bfe1813-24de-4b7f-bfdd-028acad03a01-dispersionconf\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.875838 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9bfe1813-24de-4b7f-bfdd-028acad03a01-ring-data-devices\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.875993 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9bfe1813-24de-4b7f-bfdd-028acad03a01-swiftconf\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.876119 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2skg\" (UniqueName: \"kubernetes.io/projected/9bfe1813-24de-4b7f-bfdd-028acad03a01-kube-api-access-z2skg\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.876341 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9bfe1813-24de-4b7f-bfdd-028acad03a01-etc-swift\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.978315 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2skg\" (UniqueName: \"kubernetes.io/projected/9bfe1813-24de-4b7f-bfdd-028acad03a01-kube-api-access-z2skg\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.978402 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9bfe1813-24de-4b7f-bfdd-028acad03a01-etc-swift\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.978446 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bfe1813-24de-4b7f-bfdd-028acad03a01-scripts\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.978472 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9bfe1813-24de-4b7f-bfdd-028acad03a01-dispersionconf\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.979035 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9bfe1813-24de-4b7f-bfdd-028acad03a01-etc-swift\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.979304 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bfe1813-24de-4b7f-bfdd-028acad03a01-scripts\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.979403 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9bfe1813-24de-4b7f-bfdd-028acad03a01-ring-data-devices\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.979447 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9bfe1813-24de-4b7f-bfdd-028acad03a01-swiftconf\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.980615 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9bfe1813-24de-4b7f-bfdd-028acad03a01-ring-data-devices\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.984267 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9bfe1813-24de-4b7f-bfdd-028acad03a01-swiftconf\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.990785 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9bfe1813-24de-4b7f-bfdd-028acad03a01-dispersionconf\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:32 crc kubenswrapper[4738]: I0307 07:38:32.994906 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2skg\" (UniqueName: \"kubernetes.io/projected/9bfe1813-24de-4b7f-bfdd-028acad03a01-kube-api-access-z2skg\") pod \"swift-ring-rebalance-debug-bj5bs\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:33 crc kubenswrapper[4738]: I0307 07:38:33.050752 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:33 crc kubenswrapper[4738]: I0307 07:38:33.259999 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"]
Mar 07 07:38:33 crc kubenswrapper[4738]: W0307 07:38:33.264182 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bfe1813_24de_4b7f_bfdd_028acad03a01.slice/crio-7723f1366880e0fe0a63cbcc838ed518318c9a4ca27f15cefd7db4fbd340a183 WatchSource:0}: Error finding container 7723f1366880e0fe0a63cbcc838ed518318c9a4ca27f15cefd7db4fbd340a183: Status 404 returned error can't find the container with id 7723f1366880e0fe0a63cbcc838ed518318c9a4ca27f15cefd7db4fbd340a183
Mar 07 07:38:34 crc kubenswrapper[4738]: I0307 07:38:34.222803 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs" event={"ID":"9bfe1813-24de-4b7f-bfdd-028acad03a01","Type":"ContainerStarted","Data":"49c0a505c0be687a79cbbfa717da8f6bb0a6ddbc00a465d224a2de7d385f1cfe"}
Mar 07 07:38:34 crc kubenswrapper[4738]: I0307 07:38:34.223235 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs" event={"ID":"9bfe1813-24de-4b7f-bfdd-028acad03a01","Type":"ContainerStarted","Data":"7723f1366880e0fe0a63cbcc838ed518318c9a4ca27f15cefd7db4fbd340a183"}
Mar 07 07:38:34 crc kubenswrapper[4738]: I0307 07:38:34.249807 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs" podStartSLOduration=2.249784994 podStartE2EDuration="2.249784994s" podCreationTimestamp="2026-03-07 07:38:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:38:34.242788828 +0000 UTC m=+2332.707776159" watchObservedRunningTime="2026-03-07 07:38:34.249784994 +0000 UTC m=+2332.714772315"
Mar 07 07:38:35 crc kubenswrapper[4738]: I0307 07:38:35.233779 4738 generic.go:334] "Generic (PLEG): container finished" podID="9bfe1813-24de-4b7f-bfdd-028acad03a01" containerID="49c0a505c0be687a79cbbfa717da8f6bb0a6ddbc00a465d224a2de7d385f1cfe" exitCode=0
Mar 07 07:38:35 crc kubenswrapper[4738]: I0307 07:38:35.233872 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs" event={"ID":"9bfe1813-24de-4b7f-bfdd-028acad03a01","Type":"ContainerDied","Data":"49c0a505c0be687a79cbbfa717da8f6bb0a6ddbc00a465d224a2de7d385f1cfe"}
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.567180 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.601706 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"]
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.606514 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"]
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.741413 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9bfe1813-24de-4b7f-bfdd-028acad03a01-etc-swift\") pod \"9bfe1813-24de-4b7f-bfdd-028acad03a01\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") "
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.741525 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z2skg\" (UniqueName: \"kubernetes.io/projected/9bfe1813-24de-4b7f-bfdd-028acad03a01-kube-api-access-z2skg\") pod \"9bfe1813-24de-4b7f-bfdd-028acad03a01\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") "
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.741559 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9bfe1813-24de-4b7f-bfdd-028acad03a01-ring-data-devices\") pod \"9bfe1813-24de-4b7f-bfdd-028acad03a01\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") "
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.741677 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9bfe1813-24de-4b7f-bfdd-028acad03a01-dispersionconf\") pod \"9bfe1813-24de-4b7f-bfdd-028acad03a01\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") "
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.741757 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9bfe1813-24de-4b7f-bfdd-028acad03a01-swiftconf\") pod \"9bfe1813-24de-4b7f-bfdd-028acad03a01\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") "
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.741801 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bfe1813-24de-4b7f-bfdd-028acad03a01-scripts\") pod \"9bfe1813-24de-4b7f-bfdd-028acad03a01\" (UID: \"9bfe1813-24de-4b7f-bfdd-028acad03a01\") "
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.742840 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bfe1813-24de-4b7f-bfdd-028acad03a01-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9bfe1813-24de-4b7f-bfdd-028acad03a01" (UID: "9bfe1813-24de-4b7f-bfdd-028acad03a01"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.743468 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bfe1813-24de-4b7f-bfdd-028acad03a01-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9bfe1813-24de-4b7f-bfdd-028acad03a01" (UID: "9bfe1813-24de-4b7f-bfdd-028acad03a01"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.752336 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bfe1813-24de-4b7f-bfdd-028acad03a01-kube-api-access-z2skg" (OuterVolumeSpecName: "kube-api-access-z2skg") pod "9bfe1813-24de-4b7f-bfdd-028acad03a01" (UID: "9bfe1813-24de-4b7f-bfdd-028acad03a01"). InnerVolumeSpecName "kube-api-access-z2skg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.770403 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bfe1813-24de-4b7f-bfdd-028acad03a01-scripts" (OuterVolumeSpecName: "scripts") pod "9bfe1813-24de-4b7f-bfdd-028acad03a01" (UID: "9bfe1813-24de-4b7f-bfdd-028acad03a01"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.772618 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bfe1813-24de-4b7f-bfdd-028acad03a01-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9bfe1813-24de-4b7f-bfdd-028acad03a01" (UID: "9bfe1813-24de-4b7f-bfdd-028acad03a01"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.778004 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bfe1813-24de-4b7f-bfdd-028acad03a01-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9bfe1813-24de-4b7f-bfdd-028acad03a01" (UID: "9bfe1813-24de-4b7f-bfdd-028acad03a01"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.843211 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9bfe1813-24de-4b7f-bfdd-028acad03a01-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.843248 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9bfe1813-24de-4b7f-bfdd-028acad03a01-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.843261 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9bfe1813-24de-4b7f-bfdd-028acad03a01-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.843274 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9bfe1813-24de-4b7f-bfdd-028acad03a01-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.843288 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z2skg\" (UniqueName: \"kubernetes.io/projected/9bfe1813-24de-4b7f-bfdd-028acad03a01-kube-api-access-z2skg\") on node \"crc\" DevicePath \"\""
Mar 07 07:38:36 crc kubenswrapper[4738]: I0307 07:38:36.843303 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9bfe1813-24de-4b7f-bfdd-028acad03a01-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.259604 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7723f1366880e0fe0a63cbcc838ed518318c9a4ca27f15cefd7db4fbd340a183"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.259706 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bj5bs"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.753673 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"]
Mar 07 07:38:37 crc kubenswrapper[4738]: E0307 07:38:37.753927 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bfe1813-24de-4b7f-bfdd-028acad03a01" containerName="swift-ring-rebalance"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.753938 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bfe1813-24de-4b7f-bfdd-028acad03a01" containerName="swift-ring-rebalance"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.754109 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bfe1813-24de-4b7f-bfdd-028acad03a01" containerName="swift-ring-rebalance"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.754630 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.756965 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.757847 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.760666 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-etc-swift\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.760799 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-dispersionconf\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.760985 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-swiftconf\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.761044 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv947\" (UniqueName: \"kubernetes.io/projected/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-kube-api-access-rv947\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.761149 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-ring-data-devices\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.761324 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-scripts\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.780304 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"]
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.862902 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-scripts\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.863089 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-etc-swift\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.863177 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-dispersionconf\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.863293 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-swiftconf\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.863344 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv947\" (UniqueName: \"kubernetes.io/projected/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-kube-api-access-rv947\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.863426 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-ring-data-devices\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.863873 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-etc-swift\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.864246 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-scripts\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.865026 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-ring-data-devices\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.871767 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-swiftconf\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.875839 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-dispersionconf\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:37 crc kubenswrapper[4738]: I0307 07:38:37.889174 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv947\" (UniqueName: \"kubernetes.io/projected/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-kube-api-access-rv947\") pod \"swift-ring-rebalance-debug-hq7rn\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:38 crc kubenswrapper[4738]: I0307 07:38:38.085516 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:38 crc kubenswrapper[4738]: I0307 07:38:38.296858 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"]
Mar 07 07:38:38 crc kubenswrapper[4738]: I0307 07:38:38.396067 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bfe1813-24de-4b7f-bfdd-028acad03a01" path="/var/lib/kubelet/pods/9bfe1813-24de-4b7f-bfdd-028acad03a01/volumes"
Mar 07 07:38:39 crc kubenswrapper[4738]: I0307 07:38:39.281979 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn" event={"ID":"060eb169-ef2f-4f24-b3c1-d1e273c9b4db","Type":"ContainerStarted","Data":"6eab9ec531abbd2644ab680063aed1693ac3bb8f83c9d53d28eb1525b47154c7"}
Mar 07 07:38:39 crc kubenswrapper[4738]: I0307 07:38:39.282454 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn" event={"ID":"060eb169-ef2f-4f24-b3c1-d1e273c9b4db","Type":"ContainerStarted","Data":"e630f7d44edc5571f2c22038aca10ab4bbc6c27b4e8876ce9875e7161d04afde"}
Mar 07 07:38:39 crc kubenswrapper[4738]: I0307 07:38:39.314753 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn" podStartSLOduration=2.314736763 podStartE2EDuration="2.314736763s" podCreationTimestamp="2026-03-07 07:38:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:38:39.308687431 +0000 UTC m=+2337.773674762" watchObservedRunningTime="2026-03-07 07:38:39.314736763 +0000 UTC m=+2337.779724084"
Mar 07 07:38:40 crc kubenswrapper[4738]: I0307 07:38:40.300297 4738 generic.go:334] "Generic (PLEG): container finished" podID="060eb169-ef2f-4f24-b3c1-d1e273c9b4db" containerID="6eab9ec531abbd2644ab680063aed1693ac3bb8f83c9d53d28eb1525b47154c7" exitCode=0
Mar 07 07:38:40 crc kubenswrapper[4738]: I0307 07:38:40.300380 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn" event={"ID":"060eb169-ef2f-4f24-b3c1-d1e273c9b4db","Type":"ContainerDied","Data":"6eab9ec531abbd2644ab680063aed1693ac3bb8f83c9d53d28eb1525b47154c7"}
Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.642381 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"
Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.672635 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"]
Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.680557 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn"]
Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.820589 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-ring-data-devices\") pod \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") "
Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.820731 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-scripts\") pod \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") "
Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.820811 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-etc-swift\") pod \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") "
Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.820917 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rv947\" (UniqueName: \"kubernetes.io/projected/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-kube-api-access-rv947\") pod \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") "
Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.820966 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-dispersionconf\") pod \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") "
Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.821025 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-swiftconf\") pod \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\" (UID: \"060eb169-ef2f-4f24-b3c1-d1e273c9b4db\") "
Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.822012 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "060eb169-ef2f-4f24-b3c1-d1e273c9b4db" (UID: "060eb169-ef2f-4f24-b3c1-d1e273c9b4db"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.822202 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "060eb169-ef2f-4f24-b3c1-d1e273c9b4db" (UID: "060eb169-ef2f-4f24-b3c1-d1e273c9b4db"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.833432 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-kube-api-access-rv947" (OuterVolumeSpecName: "kube-api-access-rv947") pod "060eb169-ef2f-4f24-b3c1-d1e273c9b4db" (UID: "060eb169-ef2f-4f24-b3c1-d1e273c9b4db"). InnerVolumeSpecName "kube-api-access-rv947". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.844591 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-scripts" (OuterVolumeSpecName: "scripts") pod "060eb169-ef2f-4f24-b3c1-d1e273c9b4db" (UID: "060eb169-ef2f-4f24-b3c1-d1e273c9b4db"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.846812 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "060eb169-ef2f-4f24-b3c1-d1e273c9b4db" (UID: "060eb169-ef2f-4f24-b3c1-d1e273c9b4db"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.860732 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "060eb169-ef2f-4f24-b3c1-d1e273c9b4db" (UID: "060eb169-ef2f-4f24-b3c1-d1e273c9b4db"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.923610 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.923657 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.923673 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rv947\" (UniqueName: \"kubernetes.io/projected/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-kube-api-access-rv947\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.923695 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.923713 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.923725 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/060eb169-ef2f-4f24-b3c1-d1e273c9b4db-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.977450 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-629nq"] Mar 07 07:38:41 crc kubenswrapper[4738]: E0307 07:38:41.977972 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="060eb169-ef2f-4f24-b3c1-d1e273c9b4db" 
containerName="swift-ring-rebalance" Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.977997 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="060eb169-ef2f-4f24-b3c1-d1e273c9b4db" containerName="swift-ring-rebalance" Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.978198 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="060eb169-ef2f-4f24-b3c1-d1e273c9b4db" containerName="swift-ring-rebalance" Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.979516 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-629nq" Mar 07 07:38:41 crc kubenswrapper[4738]: I0307 07:38:41.997619 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-629nq"] Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.126857 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd6468af-0f60-43a2-823a-abdfd6f3fbb4-utilities\") pod \"community-operators-629nq\" (UID: \"cd6468af-0f60-43a2-823a-abdfd6f3fbb4\") " pod="openshift-marketplace/community-operators-629nq" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.126917 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd6468af-0f60-43a2-823a-abdfd6f3fbb4-catalog-content\") pod \"community-operators-629nq\" (UID: \"cd6468af-0f60-43a2-823a-abdfd6f3fbb4\") " pod="openshift-marketplace/community-operators-629nq" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.126959 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxd94\" (UniqueName: \"kubernetes.io/projected/cd6468af-0f60-43a2-823a-abdfd6f3fbb4-kube-api-access-bxd94\") pod \"community-operators-629nq\" (UID: \"cd6468af-0f60-43a2-823a-abdfd6f3fbb4\") 
" pod="openshift-marketplace/community-operators-629nq" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.228409 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd6468af-0f60-43a2-823a-abdfd6f3fbb4-utilities\") pod \"community-operators-629nq\" (UID: \"cd6468af-0f60-43a2-823a-abdfd6f3fbb4\") " pod="openshift-marketplace/community-operators-629nq" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.228477 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd6468af-0f60-43a2-823a-abdfd6f3fbb4-catalog-content\") pod \"community-operators-629nq\" (UID: \"cd6468af-0f60-43a2-823a-abdfd6f3fbb4\") " pod="openshift-marketplace/community-operators-629nq" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.228514 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxd94\" (UniqueName: \"kubernetes.io/projected/cd6468af-0f60-43a2-823a-abdfd6f3fbb4-kube-api-access-bxd94\") pod \"community-operators-629nq\" (UID: \"cd6468af-0f60-43a2-823a-abdfd6f3fbb4\") " pod="openshift-marketplace/community-operators-629nq" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.229576 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd6468af-0f60-43a2-823a-abdfd6f3fbb4-catalog-content\") pod \"community-operators-629nq\" (UID: \"cd6468af-0f60-43a2-823a-abdfd6f3fbb4\") " pod="openshift-marketplace/community-operators-629nq" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.229668 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd6468af-0f60-43a2-823a-abdfd6f3fbb4-utilities\") pod \"community-operators-629nq\" (UID: \"cd6468af-0f60-43a2-823a-abdfd6f3fbb4\") " 
pod="openshift-marketplace/community-operators-629nq" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.245325 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxd94\" (UniqueName: \"kubernetes.io/projected/cd6468af-0f60-43a2-823a-abdfd6f3fbb4-kube-api-access-bxd94\") pod \"community-operators-629nq\" (UID: \"cd6468af-0f60-43a2-823a-abdfd6f3fbb4\") " pod="openshift-marketplace/community-operators-629nq" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.302622 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-629nq" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.324533 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e630f7d44edc5571f2c22038aca10ab4bbc6c27b4e8876ce9875e7161d04afde" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.324608 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-hq7rn" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.397295 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="060eb169-ef2f-4f24-b3c1-d1e273c9b4db" path="/var/lib/kubelet/pods/060eb169-ef2f-4f24-b3c1-d1e273c9b4db/volumes" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.569200 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-629nq"] Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.835183 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jnfth"] Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.836196 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.839472 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.839692 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.845554 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jnfth"] Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.850183 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4dda03e-caeb-46a3-841d-a44da30fa14b-scripts\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.850215 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d4dda03e-caeb-46a3-841d-a44da30fa14b-dispersionconf\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.850309 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d4dda03e-caeb-46a3-841d-a44da30fa14b-swiftconf\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.850335 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d4dda03e-caeb-46a3-841d-a44da30fa14b-ring-data-devices\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.850379 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59mxn\" (UniqueName: \"kubernetes.io/projected/d4dda03e-caeb-46a3-841d-a44da30fa14b-kube-api-access-59mxn\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.850434 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d4dda03e-caeb-46a3-841d-a44da30fa14b-etc-swift\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.951306 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d4dda03e-caeb-46a3-841d-a44da30fa14b-etc-swift\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.951398 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4dda03e-caeb-46a3-841d-a44da30fa14b-scripts\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.951419 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d4dda03e-caeb-46a3-841d-a44da30fa14b-dispersionconf\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.951454 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d4dda03e-caeb-46a3-841d-a44da30fa14b-swiftconf\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.951481 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d4dda03e-caeb-46a3-841d-a44da30fa14b-ring-data-devices\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.951524 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59mxn\" (UniqueName: \"kubernetes.io/projected/d4dda03e-caeb-46a3-841d-a44da30fa14b-kube-api-access-59mxn\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.951747 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d4dda03e-caeb-46a3-841d-a44da30fa14b-etc-swift\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 
07:38:42.952635 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4dda03e-caeb-46a3-841d-a44da30fa14b-scripts\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.952929 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d4dda03e-caeb-46a3-841d-a44da30fa14b-ring-data-devices\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.957990 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d4dda03e-caeb-46a3-841d-a44da30fa14b-dispersionconf\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.958523 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d4dda03e-caeb-46a3-841d-a44da30fa14b-swiftconf\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:42 crc kubenswrapper[4738]: I0307 07:38:42.980500 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59mxn\" (UniqueName: \"kubernetes.io/projected/d4dda03e-caeb-46a3-841d-a44da30fa14b-kube-api-access-59mxn\") pod \"swift-ring-rebalance-debug-jnfth\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:43 crc kubenswrapper[4738]: I0307 07:38:43.149578 4738 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:43 crc kubenswrapper[4738]: I0307 07:38:43.361846 4738 generic.go:334] "Generic (PLEG): container finished" podID="cd6468af-0f60-43a2-823a-abdfd6f3fbb4" containerID="a5e9a4077741f168f5291a321ad46df83647fcbed8f51991043188c0a6d46ab7" exitCode=0 Mar 07 07:38:43 crc kubenswrapper[4738]: I0307 07:38:43.362139 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-629nq" event={"ID":"cd6468af-0f60-43a2-823a-abdfd6f3fbb4","Type":"ContainerDied","Data":"a5e9a4077741f168f5291a321ad46df83647fcbed8f51991043188c0a6d46ab7"} Mar 07 07:38:43 crc kubenswrapper[4738]: I0307 07:38:43.362191 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-629nq" event={"ID":"cd6468af-0f60-43a2-823a-abdfd6f3fbb4","Type":"ContainerStarted","Data":"61c7a82a5d2c24af6dc8b154e8336f3f52768758e2855423fb5fba049b6ed9c8"} Mar 07 07:38:43 crc kubenswrapper[4738]: I0307 07:38:43.598930 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jnfth"] Mar 07 07:38:43 crc kubenswrapper[4738]: W0307 07:38:43.603960 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4dda03e_caeb_46a3_841d_a44da30fa14b.slice/crio-e84db39779d78328c102731edbc7b6e9771af48f6572d2006859913e8c79f4e9 WatchSource:0}: Error finding container e84db39779d78328c102731edbc7b6e9771af48f6572d2006859913e8c79f4e9: Status 404 returned error can't find the container with id e84db39779d78328c102731edbc7b6e9771af48f6572d2006859913e8c79f4e9 Mar 07 07:38:44 crc kubenswrapper[4738]: I0307 07:38:44.371382 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" 
event={"ID":"d4dda03e-caeb-46a3-841d-a44da30fa14b","Type":"ContainerStarted","Data":"c1c073b1c2df0ddd6393eb070b8e8267f2f1ad27bdae61d288dbd1ea0e6211d0"} Mar 07 07:38:44 crc kubenswrapper[4738]: I0307 07:38:44.371716 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" event={"ID":"d4dda03e-caeb-46a3-841d-a44da30fa14b","Type":"ContainerStarted","Data":"e84db39779d78328c102731edbc7b6e9771af48f6572d2006859913e8c79f4e9"} Mar 07 07:38:44 crc kubenswrapper[4738]: I0307 07:38:44.392429 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" podStartSLOduration=2.392415181 podStartE2EDuration="2.392415181s" podCreationTimestamp="2026-03-07 07:38:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:38:44.387864859 +0000 UTC m=+2342.852852190" watchObservedRunningTime="2026-03-07 07:38:44.392415181 +0000 UTC m=+2342.857402502" Mar 07 07:38:45 crc kubenswrapper[4738]: I0307 07:38:45.382878 4738 generic.go:334] "Generic (PLEG): container finished" podID="d4dda03e-caeb-46a3-841d-a44da30fa14b" containerID="c1c073b1c2df0ddd6393eb070b8e8267f2f1ad27bdae61d288dbd1ea0e6211d0" exitCode=0 Mar 07 07:38:45 crc kubenswrapper[4738]: I0307 07:38:45.382994 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" event={"ID":"d4dda03e-caeb-46a3-841d-a44da30fa14b","Type":"ContainerDied","Data":"c1c073b1c2df0ddd6393eb070b8e8267f2f1ad27bdae61d288dbd1ea0e6211d0"} Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.644470 4738 scope.go:117] "RemoveContainer" containerID="804d9e3e7c04a503f67ef04c1aacfa86a98005e3b91b55935dc1a47e60327a46" Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.690089 4738 scope.go:117] "RemoveContainer" containerID="4a2d958aa533a5e510557c8fefeefec4d81ad2c944e19cb36d07a5102435935b" 
Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.731097 4738 scope.go:117] "RemoveContainer" containerID="6036dccf516273422d89097117e7260946f593754499f5e83ae63896ad481b34" Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.823879 4738 scope.go:117] "RemoveContainer" containerID="d3f99dcec3b3cf845a344418fc48b7c6f27393dad5fc0da1bea1fe962412cf64" Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.824811 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.870593 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jnfth"] Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.878858 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jnfth"] Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.916866 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d4dda03e-caeb-46a3-841d-a44da30fa14b-dispersionconf\") pod \"d4dda03e-caeb-46a3-841d-a44da30fa14b\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.916920 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4dda03e-caeb-46a3-841d-a44da30fa14b-scripts\") pod \"d4dda03e-caeb-46a3-841d-a44da30fa14b\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.916974 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d4dda03e-caeb-46a3-841d-a44da30fa14b-ring-data-devices\") pod \"d4dda03e-caeb-46a3-841d-a44da30fa14b\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " Mar 07 07:38:46 crc 
kubenswrapper[4738]: I0307 07:38:46.917014 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d4dda03e-caeb-46a3-841d-a44da30fa14b-swiftconf\") pod \"d4dda03e-caeb-46a3-841d-a44da30fa14b\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.917173 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d4dda03e-caeb-46a3-841d-a44da30fa14b-etc-swift\") pod \"d4dda03e-caeb-46a3-841d-a44da30fa14b\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.917208 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59mxn\" (UniqueName: \"kubernetes.io/projected/d4dda03e-caeb-46a3-841d-a44da30fa14b-kube-api-access-59mxn\") pod \"d4dda03e-caeb-46a3-841d-a44da30fa14b\" (UID: \"d4dda03e-caeb-46a3-841d-a44da30fa14b\") " Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.917715 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4dda03e-caeb-46a3-841d-a44da30fa14b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d4dda03e-caeb-46a3-841d-a44da30fa14b" (UID: "d4dda03e-caeb-46a3-841d-a44da30fa14b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.918019 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4dda03e-caeb-46a3-841d-a44da30fa14b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d4dda03e-caeb-46a3-841d-a44da30fa14b" (UID: "d4dda03e-caeb-46a3-841d-a44da30fa14b"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.922883 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4dda03e-caeb-46a3-841d-a44da30fa14b-kube-api-access-59mxn" (OuterVolumeSpecName: "kube-api-access-59mxn") pod "d4dda03e-caeb-46a3-841d-a44da30fa14b" (UID: "d4dda03e-caeb-46a3-841d-a44da30fa14b"). InnerVolumeSpecName "kube-api-access-59mxn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.939055 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4dda03e-caeb-46a3-841d-a44da30fa14b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d4dda03e-caeb-46a3-841d-a44da30fa14b" (UID: "d4dda03e-caeb-46a3-841d-a44da30fa14b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.939330 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4dda03e-caeb-46a3-841d-a44da30fa14b-scripts" (OuterVolumeSpecName: "scripts") pod "d4dda03e-caeb-46a3-841d-a44da30fa14b" (UID: "d4dda03e-caeb-46a3-841d-a44da30fa14b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:46 crc kubenswrapper[4738]: I0307 07:38:46.939875 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4dda03e-caeb-46a3-841d-a44da30fa14b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d4dda03e-caeb-46a3-841d-a44da30fa14b" (UID: "d4dda03e-caeb-46a3-841d-a44da30fa14b"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:47 crc kubenswrapper[4738]: I0307 07:38:47.019449 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d4dda03e-caeb-46a3-841d-a44da30fa14b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:47 crc kubenswrapper[4738]: I0307 07:38:47.019537 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59mxn\" (UniqueName: \"kubernetes.io/projected/d4dda03e-caeb-46a3-841d-a44da30fa14b-kube-api-access-59mxn\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:47 crc kubenswrapper[4738]: I0307 07:38:47.019558 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d4dda03e-caeb-46a3-841d-a44da30fa14b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:47 crc kubenswrapper[4738]: I0307 07:38:47.019576 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d4dda03e-caeb-46a3-841d-a44da30fa14b-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:47 crc kubenswrapper[4738]: I0307 07:38:47.019592 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d4dda03e-caeb-46a3-841d-a44da30fa14b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:47 crc kubenswrapper[4738]: I0307 07:38:47.019606 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d4dda03e-caeb-46a3-841d-a44da30fa14b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:47 crc kubenswrapper[4738]: I0307 07:38:47.401644 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e84db39779d78328c102731edbc7b6e9771af48f6572d2006859913e8c79f4e9" Mar 07 07:38:47 crc kubenswrapper[4738]: I0307 07:38:47.401665 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jnfth" Mar 07 07:38:47 crc kubenswrapper[4738]: I0307 07:38:47.403699 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-629nq" event={"ID":"cd6468af-0f60-43a2-823a-abdfd6f3fbb4","Type":"ContainerStarted","Data":"e99d318699fee6229d2057a01cc71bc4edbdd93db05e9a9dec176800fe374c99"} Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.076472 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj"] Mar 07 07:38:48 crc kubenswrapper[4738]: E0307 07:38:48.077051 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4dda03e-caeb-46a3-841d-a44da30fa14b" containerName="swift-ring-rebalance" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.077064 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4dda03e-caeb-46a3-841d-a44da30fa14b" containerName="swift-ring-rebalance" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.077250 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4dda03e-caeb-46a3-841d-a44da30fa14b" containerName="swift-ring-rebalance" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.078002 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.080290 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.080387 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.084408 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj"] Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.240762 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-ring-data-devices\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.240920 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-swiftconf\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.240944 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crc6t\" (UniqueName: \"kubernetes.io/projected/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-kube-api-access-crc6t\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.240971 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-dispersionconf\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.240998 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-etc-swift\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.241018 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-scripts\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.342512 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-ring-data-devices\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.342627 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-swiftconf\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: 
I0307 07:38:48.342654 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crc6t\" (UniqueName: \"kubernetes.io/projected/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-kube-api-access-crc6t\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.342688 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-dispersionconf\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.342730 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-etc-swift\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.342762 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-scripts\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.343600 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-etc-swift\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 
07:38:48.343730 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-scripts\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.344040 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-ring-data-devices\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.348902 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-dispersionconf\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.348964 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-swiftconf\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.368679 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crc6t\" (UniqueName: \"kubernetes.io/projected/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-kube-api-access-crc6t\") pod \"swift-ring-rebalance-debug-b6dbj\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.396025 4738 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.403090 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4dda03e-caeb-46a3-841d-a44da30fa14b" path="/var/lib/kubelet/pods/d4dda03e-caeb-46a3-841d-a44da30fa14b/volumes" Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.419040 4738 generic.go:334] "Generic (PLEG): container finished" podID="cd6468af-0f60-43a2-823a-abdfd6f3fbb4" containerID="e99d318699fee6229d2057a01cc71bc4edbdd93db05e9a9dec176800fe374c99" exitCode=0 Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.419105 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-629nq" event={"ID":"cd6468af-0f60-43a2-823a-abdfd6f3fbb4","Type":"ContainerDied","Data":"e99d318699fee6229d2057a01cc71bc4edbdd93db05e9a9dec176800fe374c99"} Mar 07 07:38:48 crc kubenswrapper[4738]: I0307 07:38:48.884568 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj"] Mar 07 07:38:49 crc kubenswrapper[4738]: I0307 07:38:49.429694 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" event={"ID":"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b","Type":"ContainerStarted","Data":"1607677b370d069c3e32ad144c9f70f8fdf55a2f1385721dc47e6408bd2b8404"} Mar 07 07:38:49 crc kubenswrapper[4738]: I0307 07:38:49.429985 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" event={"ID":"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b","Type":"ContainerStarted","Data":"70354beb76663fdec94d25b0a2a0fdc1799b3697f8367faa77ad8af7f57b82ee"} Mar 07 07:38:49 crc kubenswrapper[4738]: I0307 07:38:49.432906 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-629nq" 
event={"ID":"cd6468af-0f60-43a2-823a-abdfd6f3fbb4","Type":"ContainerStarted","Data":"dd857325608f08c876960dfa1534ccfb77ddf96156f65f620c829ada5deed657"} Mar 07 07:38:49 crc kubenswrapper[4738]: I0307 07:38:49.453146 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" podStartSLOduration=1.453126887 podStartE2EDuration="1.453126887s" podCreationTimestamp="2026-03-07 07:38:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:38:49.452714655 +0000 UTC m=+2347.917701996" watchObservedRunningTime="2026-03-07 07:38:49.453126887 +0000 UTC m=+2347.918114208" Mar 07 07:38:49 crc kubenswrapper[4738]: I0307 07:38:49.480921 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-629nq" podStartSLOduration=2.847038689 podStartE2EDuration="8.480904207s" podCreationTimestamp="2026-03-07 07:38:41 +0000 UTC" firstStartedPulling="2026-03-07 07:38:43.364221486 +0000 UTC m=+2341.829208807" lastFinishedPulling="2026-03-07 07:38:48.998087004 +0000 UTC m=+2347.463074325" observedRunningTime="2026-03-07 07:38:49.47537677 +0000 UTC m=+2347.940364091" watchObservedRunningTime="2026-03-07 07:38:49.480904207 +0000 UTC m=+2347.945891528" Mar 07 07:38:50 crc kubenswrapper[4738]: I0307 07:38:50.442639 4738 generic.go:334] "Generic (PLEG): container finished" podID="3a60ed25-c86e-46d6-9fc4-cc1d33f2864b" containerID="1607677b370d069c3e32ad144c9f70f8fdf55a2f1385721dc47e6408bd2b8404" exitCode=0 Mar 07 07:38:50 crc kubenswrapper[4738]: I0307 07:38:50.442740 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" event={"ID":"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b","Type":"ContainerDied","Data":"1607677b370d069c3e32ad144c9f70f8fdf55a2f1385721dc47e6408bd2b8404"} Mar 07 07:38:51 crc kubenswrapper[4738]: I0307 
07:38:51.797437 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:51 crc kubenswrapper[4738]: I0307 07:38:51.838970 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj"] Mar 07 07:38:51 crc kubenswrapper[4738]: I0307 07:38:51.844374 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj"] Mar 07 07:38:51 crc kubenswrapper[4738]: I0307 07:38:51.896258 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crc6t\" (UniqueName: \"kubernetes.io/projected/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-kube-api-access-crc6t\") pod \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " Mar 07 07:38:51 crc kubenswrapper[4738]: I0307 07:38:51.896294 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-ring-data-devices\") pod \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " Mar 07 07:38:51 crc kubenswrapper[4738]: I0307 07:38:51.896322 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-scripts\") pod \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " Mar 07 07:38:51 crc kubenswrapper[4738]: I0307 07:38:51.896364 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-swiftconf\") pod \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " Mar 07 07:38:51 crc kubenswrapper[4738]: I0307 07:38:51.896499 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-etc-swift\") pod \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " Mar 07 07:38:51 crc kubenswrapper[4738]: I0307 07:38:51.896524 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-dispersionconf\") pod \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\" (UID: \"3a60ed25-c86e-46d6-9fc4-cc1d33f2864b\") " Mar 07 07:38:51 crc kubenswrapper[4738]: I0307 07:38:51.897229 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "3a60ed25-c86e-46d6-9fc4-cc1d33f2864b" (UID: "3a60ed25-c86e-46d6-9fc4-cc1d33f2864b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:51 crc kubenswrapper[4738]: I0307 07:38:51.897315 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "3a60ed25-c86e-46d6-9fc4-cc1d33f2864b" (UID: "3a60ed25-c86e-46d6-9fc4-cc1d33f2864b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:38:51 crc kubenswrapper[4738]: I0307 07:38:51.904367 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-kube-api-access-crc6t" (OuterVolumeSpecName: "kube-api-access-crc6t") pod "3a60ed25-c86e-46d6-9fc4-cc1d33f2864b" (UID: "3a60ed25-c86e-46d6-9fc4-cc1d33f2864b"). InnerVolumeSpecName "kube-api-access-crc6t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:38:51 crc kubenswrapper[4738]: I0307 07:38:51.920761 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-scripts" (OuterVolumeSpecName: "scripts") pod "3a60ed25-c86e-46d6-9fc4-cc1d33f2864b" (UID: "3a60ed25-c86e-46d6-9fc4-cc1d33f2864b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:51 crc kubenswrapper[4738]: I0307 07:38:51.942848 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "3a60ed25-c86e-46d6-9fc4-cc1d33f2864b" (UID: "3a60ed25-c86e-46d6-9fc4-cc1d33f2864b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:51 crc kubenswrapper[4738]: I0307 07:38:51.944083 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "3a60ed25-c86e-46d6-9fc4-cc1d33f2864b" (UID: "3a60ed25-c86e-46d6-9fc4-cc1d33f2864b"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:52 crc kubenswrapper[4738]: I0307 07:38:51.999898 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crc6t\" (UniqueName: \"kubernetes.io/projected/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-kube-api-access-crc6t\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:52 crc kubenswrapper[4738]: I0307 07:38:51.999983 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:52 crc kubenswrapper[4738]: I0307 07:38:52.000006 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:52 crc kubenswrapper[4738]: I0307 07:38:52.000022 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:52 crc kubenswrapper[4738]: I0307 07:38:52.000069 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:52 crc kubenswrapper[4738]: I0307 07:38:52.000085 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:52 crc kubenswrapper[4738]: I0307 07:38:52.303784 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-629nq" Mar 07 07:38:52 crc kubenswrapper[4738]: I0307 07:38:52.303836 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/community-operators-629nq" Mar 07 07:38:52 crc kubenswrapper[4738]: I0307 07:38:52.364891 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-629nq" Mar 07 07:38:52 crc kubenswrapper[4738]: I0307 07:38:52.395405 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a60ed25-c86e-46d6-9fc4-cc1d33f2864b" path="/var/lib/kubelet/pods/3a60ed25-c86e-46d6-9fc4-cc1d33f2864b/volumes" Mar 07 07:38:52 crc kubenswrapper[4738]: I0307 07:38:52.463664 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-b6dbj" Mar 07 07:38:52 crc kubenswrapper[4738]: I0307 07:38:52.463665 4738 scope.go:117] "RemoveContainer" containerID="1607677b370d069c3e32ad144c9f70f8fdf55a2f1385721dc47e6408bd2b8404" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.027219 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp"] Mar 07 07:38:53 crc kubenswrapper[4738]: E0307 07:38:53.027484 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a60ed25-c86e-46d6-9fc4-cc1d33f2864b" containerName="swift-ring-rebalance" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.027497 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a60ed25-c86e-46d6-9fc4-cc1d33f2864b" containerName="swift-ring-rebalance" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.027641 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a60ed25-c86e-46d6-9fc4-cc1d33f2864b" containerName="swift-ring-rebalance" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.028075 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.029806 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.032255 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.040065 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp"] Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.117642 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/28efa262-e082-42dd-8b79-be35a73c282e-etc-swift\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.117744 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28efa262-e082-42dd-8b79-be35a73c282e-scripts\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.117776 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9nq2\" (UniqueName: \"kubernetes.io/projected/28efa262-e082-42dd-8b79-be35a73c282e-kube-api-access-n9nq2\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.117823 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/28efa262-e082-42dd-8b79-be35a73c282e-ring-data-devices\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.117843 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/28efa262-e082-42dd-8b79-be35a73c282e-dispersionconf\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.117929 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/28efa262-e082-42dd-8b79-be35a73c282e-swiftconf\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.219817 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/28efa262-e082-42dd-8b79-be35a73c282e-ring-data-devices\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.219881 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/28efa262-e082-42dd-8b79-be35a73c282e-dispersionconf\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 
crc kubenswrapper[4738]: I0307 07:38:53.219912 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/28efa262-e082-42dd-8b79-be35a73c282e-swiftconf\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.219961 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/28efa262-e082-42dd-8b79-be35a73c282e-etc-swift\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.220048 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28efa262-e082-42dd-8b79-be35a73c282e-scripts\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.220093 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9nq2\" (UniqueName: \"kubernetes.io/projected/28efa262-e082-42dd-8b79-be35a73c282e-kube-api-access-n9nq2\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.220843 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/28efa262-e082-42dd-8b79-be35a73c282e-etc-swift\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc 
kubenswrapper[4738]: I0307 07:38:53.221267 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28efa262-e082-42dd-8b79-be35a73c282e-scripts\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.221426 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/28efa262-e082-42dd-8b79-be35a73c282e-ring-data-devices\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.226022 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/28efa262-e082-42dd-8b79-be35a73c282e-swiftconf\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.231804 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/28efa262-e082-42dd-8b79-be35a73c282e-dispersionconf\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.240742 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9nq2\" (UniqueName: \"kubernetes.io/projected/28efa262-e082-42dd-8b79-be35a73c282e-kube-api-access-n9nq2\") pod \"swift-ring-rebalance-debug-mf9cp\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: 
I0307 07:38:53.361136 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:53 crc kubenswrapper[4738]: I0307 07:38:53.761869 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp"] Mar 07 07:38:54 crc kubenswrapper[4738]: I0307 07:38:54.490415 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" event={"ID":"28efa262-e082-42dd-8b79-be35a73c282e","Type":"ContainerStarted","Data":"5dd28070e510fc57d943386571d5f4061c9362505533716af779702c2b32a921"} Mar 07 07:38:54 crc kubenswrapper[4738]: I0307 07:38:54.491568 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" event={"ID":"28efa262-e082-42dd-8b79-be35a73c282e","Type":"ContainerStarted","Data":"dde7b131f72735aee3a0baf153828ef4ec07752138321c508962edc5c7ed944a"} Mar 07 07:38:54 crc kubenswrapper[4738]: I0307 07:38:54.515963 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" podStartSLOduration=1.515946939 podStartE2EDuration="1.515946939s" podCreationTimestamp="2026-03-07 07:38:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:38:54.512052495 +0000 UTC m=+2352.977039826" watchObservedRunningTime="2026-03-07 07:38:54.515946939 +0000 UTC m=+2352.980934260" Mar 07 07:38:55 crc kubenswrapper[4738]: I0307 07:38:55.505941 4738 generic.go:334] "Generic (PLEG): container finished" podID="28efa262-e082-42dd-8b79-be35a73c282e" containerID="5dd28070e510fc57d943386571d5f4061c9362505533716af779702c2b32a921" exitCode=0 Mar 07 07:38:55 crc kubenswrapper[4738]: I0307 07:38:55.506055 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" 
event={"ID":"28efa262-e082-42dd-8b79-be35a73c282e","Type":"ContainerDied","Data":"5dd28070e510fc57d943386571d5f4061c9362505533716af779702c2b32a921"} Mar 07 07:38:56 crc kubenswrapper[4738]: I0307 07:38:56.868150 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:56 crc kubenswrapper[4738]: I0307 07:38:56.897452 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp"] Mar 07 07:38:56 crc kubenswrapper[4738]: I0307 07:38:56.904572 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp"] Mar 07 07:38:56 crc kubenswrapper[4738]: I0307 07:38:56.976314 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/28efa262-e082-42dd-8b79-be35a73c282e-etc-swift\") pod \"28efa262-e082-42dd-8b79-be35a73c282e\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " Mar 07 07:38:56 crc kubenswrapper[4738]: I0307 07:38:56.976435 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/28efa262-e082-42dd-8b79-be35a73c282e-ring-data-devices\") pod \"28efa262-e082-42dd-8b79-be35a73c282e\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " Mar 07 07:38:56 crc kubenswrapper[4738]: I0307 07:38:56.976483 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28efa262-e082-42dd-8b79-be35a73c282e-scripts\") pod \"28efa262-e082-42dd-8b79-be35a73c282e\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " Mar 07 07:38:56 crc kubenswrapper[4738]: I0307 07:38:56.976510 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9nq2\" (UniqueName: 
\"kubernetes.io/projected/28efa262-e082-42dd-8b79-be35a73c282e-kube-api-access-n9nq2\") pod \"28efa262-e082-42dd-8b79-be35a73c282e\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " Mar 07 07:38:56 crc kubenswrapper[4738]: I0307 07:38:56.976570 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/28efa262-e082-42dd-8b79-be35a73c282e-dispersionconf\") pod \"28efa262-e082-42dd-8b79-be35a73c282e\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " Mar 07 07:38:56 crc kubenswrapper[4738]: I0307 07:38:56.976655 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/28efa262-e082-42dd-8b79-be35a73c282e-swiftconf\") pod \"28efa262-e082-42dd-8b79-be35a73c282e\" (UID: \"28efa262-e082-42dd-8b79-be35a73c282e\") " Mar 07 07:38:56 crc kubenswrapper[4738]: I0307 07:38:56.976960 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28efa262-e082-42dd-8b79-be35a73c282e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "28efa262-e082-42dd-8b79-be35a73c282e" (UID: "28efa262-e082-42dd-8b79-be35a73c282e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:56 crc kubenswrapper[4738]: I0307 07:38:56.977548 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28efa262-e082-42dd-8b79-be35a73c282e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "28efa262-e082-42dd-8b79-be35a73c282e" (UID: "28efa262-e082-42dd-8b79-be35a73c282e"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:38:56 crc kubenswrapper[4738]: I0307 07:38:56.982393 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28efa262-e082-42dd-8b79-be35a73c282e-kube-api-access-n9nq2" (OuterVolumeSpecName: "kube-api-access-n9nq2") pod "28efa262-e082-42dd-8b79-be35a73c282e" (UID: "28efa262-e082-42dd-8b79-be35a73c282e"). InnerVolumeSpecName "kube-api-access-n9nq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:38:56 crc kubenswrapper[4738]: I0307 07:38:56.997355 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28efa262-e082-42dd-8b79-be35a73c282e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "28efa262-e082-42dd-8b79-be35a73c282e" (UID: "28efa262-e082-42dd-8b79-be35a73c282e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:57 crc kubenswrapper[4738]: I0307 07:38:57.005587 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28efa262-e082-42dd-8b79-be35a73c282e-scripts" (OuterVolumeSpecName: "scripts") pod "28efa262-e082-42dd-8b79-be35a73c282e" (UID: "28efa262-e082-42dd-8b79-be35a73c282e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:38:57 crc kubenswrapper[4738]: I0307 07:38:57.017405 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28efa262-e082-42dd-8b79-be35a73c282e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "28efa262-e082-42dd-8b79-be35a73c282e" (UID: "28efa262-e082-42dd-8b79-be35a73c282e"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:38:57 crc kubenswrapper[4738]: I0307 07:38:57.078846 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/28efa262-e082-42dd-8b79-be35a73c282e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:57 crc kubenswrapper[4738]: I0307 07:38:57.078917 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/28efa262-e082-42dd-8b79-be35a73c282e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:57 crc kubenswrapper[4738]: I0307 07:38:57.078935 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/28efa262-e082-42dd-8b79-be35a73c282e-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:57 crc kubenswrapper[4738]: I0307 07:38:57.079014 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9nq2\" (UniqueName: \"kubernetes.io/projected/28efa262-e082-42dd-8b79-be35a73c282e-kube-api-access-n9nq2\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:57 crc kubenswrapper[4738]: I0307 07:38:57.079029 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/28efa262-e082-42dd-8b79-be35a73c282e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:57 crc kubenswrapper[4738]: I0307 07:38:57.079041 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/28efa262-e082-42dd-8b79-be35a73c282e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:38:57 crc kubenswrapper[4738]: I0307 07:38:57.530127 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dde7b131f72735aee3a0baf153828ef4ec07752138321c508962edc5c7ed944a" Mar 07 07:38:57 crc kubenswrapper[4738]: I0307 07:38:57.530255 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mf9cp" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.079859 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg"] Mar 07 07:38:58 crc kubenswrapper[4738]: E0307 07:38:58.080235 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28efa262-e082-42dd-8b79-be35a73c282e" containerName="swift-ring-rebalance" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.080249 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="28efa262-e082-42dd-8b79-be35a73c282e" containerName="swift-ring-rebalance" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.080401 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="28efa262-e082-42dd-8b79-be35a73c282e" containerName="swift-ring-rebalance" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.080871 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.083272 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.085993 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.099941 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg"] Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.195969 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f375d3f-b229-4f84-8caa-3c5e96652f7e-etc-swift\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.196133 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f375d3f-b229-4f84-8caa-3c5e96652f7e-scripts\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.196196 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f375d3f-b229-4f84-8caa-3c5e96652f7e-ring-data-devices\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.196285 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j449h\" (UniqueName: \"kubernetes.io/projected/4f375d3f-b229-4f84-8caa-3c5e96652f7e-kube-api-access-j449h\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.196410 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f375d3f-b229-4f84-8caa-3c5e96652f7e-swiftconf\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.196475 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/4f375d3f-b229-4f84-8caa-3c5e96652f7e-dispersionconf\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.297657 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f375d3f-b229-4f84-8caa-3c5e96652f7e-scripts\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.297895 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f375d3f-b229-4f84-8caa-3c5e96652f7e-ring-data-devices\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.297948 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j449h\" (UniqueName: \"kubernetes.io/projected/4f375d3f-b229-4f84-8caa-3c5e96652f7e-kube-api-access-j449h\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.297970 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f375d3f-b229-4f84-8caa-3c5e96652f7e-swiftconf\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.297989 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f375d3f-b229-4f84-8caa-3c5e96652f7e-dispersionconf\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.298040 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f375d3f-b229-4f84-8caa-3c5e96652f7e-etc-swift\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.298467 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f375d3f-b229-4f84-8caa-3c5e96652f7e-etc-swift\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.298961 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f375d3f-b229-4f84-8caa-3c5e96652f7e-scripts\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.299371 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f375d3f-b229-4f84-8caa-3c5e96652f7e-ring-data-devices\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.305304 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/4f375d3f-b229-4f84-8caa-3c5e96652f7e-dispersionconf\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.309484 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f375d3f-b229-4f84-8caa-3c5e96652f7e-swiftconf\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.320853 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j449h\" (UniqueName: \"kubernetes.io/projected/4f375d3f-b229-4f84-8caa-3c5e96652f7e-kube-api-access-j449h\") pod \"swift-ring-rebalance-debug-w5ktg\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.394736 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28efa262-e082-42dd-8b79-be35a73c282e" path="/var/lib/kubelet/pods/28efa262-e082-42dd-8b79-be35a73c282e/volumes" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.403216 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:38:58 crc kubenswrapper[4738]: I0307 07:38:58.619494 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg"] Mar 07 07:38:59 crc kubenswrapper[4738]: I0307 07:38:59.558250 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" event={"ID":"4f375d3f-b229-4f84-8caa-3c5e96652f7e","Type":"ContainerStarted","Data":"e2d7c30877f3b3df6c1c11a7650b6b88908ac327168c8b37e4f906573c64ffba"} Mar 07 07:38:59 crc kubenswrapper[4738]: I0307 07:38:59.558296 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" event={"ID":"4f375d3f-b229-4f84-8caa-3c5e96652f7e","Type":"ContainerStarted","Data":"d36ded88871db82d0594bb6e7601ca382343669d9df01838aeb7f7e5394500ae"} Mar 07 07:38:59 crc kubenswrapper[4738]: I0307 07:38:59.597491 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" podStartSLOduration=1.597455821 podStartE2EDuration="1.597455821s" podCreationTimestamp="2026-03-07 07:38:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:38:59.580472898 +0000 UTC m=+2358.045460249" watchObservedRunningTime="2026-03-07 07:38:59.597455821 +0000 UTC m=+2358.062443182" Mar 07 07:39:00 crc kubenswrapper[4738]: I0307 07:39:00.571386 4738 generic.go:334] "Generic (PLEG): container finished" podID="4f375d3f-b229-4f84-8caa-3c5e96652f7e" containerID="e2d7c30877f3b3df6c1c11a7650b6b88908ac327168c8b37e4f906573c64ffba" exitCode=0 Mar 07 07:39:00 crc kubenswrapper[4738]: I0307 07:39:00.571441 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" 
event={"ID":"4f375d3f-b229-4f84-8caa-3c5e96652f7e","Type":"ContainerDied","Data":"e2d7c30877f3b3df6c1c11a7650b6b88908ac327168c8b37e4f906573c64ffba"} Mar 07 07:39:01 crc kubenswrapper[4738]: I0307 07:39:01.916682 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:39:01 crc kubenswrapper[4738]: I0307 07:39:01.957673 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg"] Mar 07 07:39:01 crc kubenswrapper[4738]: I0307 07:39:01.965288 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg"] Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.063832 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f375d3f-b229-4f84-8caa-3c5e96652f7e-dispersionconf\") pod \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.063880 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j449h\" (UniqueName: \"kubernetes.io/projected/4f375d3f-b229-4f84-8caa-3c5e96652f7e-kube-api-access-j449h\") pod \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.063907 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f375d3f-b229-4f84-8caa-3c5e96652f7e-swiftconf\") pod \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.063946 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/4f375d3f-b229-4f84-8caa-3c5e96652f7e-ring-data-devices\") pod \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.064508 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f375d3f-b229-4f84-8caa-3c5e96652f7e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "4f375d3f-b229-4f84-8caa-3c5e96652f7e" (UID: "4f375d3f-b229-4f84-8caa-3c5e96652f7e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.064024 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f375d3f-b229-4f84-8caa-3c5e96652f7e-etc-swift\") pod \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.064674 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f375d3f-b229-4f84-8caa-3c5e96652f7e-scripts\") pod \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\" (UID: \"4f375d3f-b229-4f84-8caa-3c5e96652f7e\") " Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.065098 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f375d3f-b229-4f84-8caa-3c5e96652f7e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "4f375d3f-b229-4f84-8caa-3c5e96652f7e" (UID: "4f375d3f-b229-4f84-8caa-3c5e96652f7e"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.065582 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/4f375d3f-b229-4f84-8caa-3c5e96652f7e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.065605 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/4f375d3f-b229-4f84-8caa-3c5e96652f7e-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.075308 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f375d3f-b229-4f84-8caa-3c5e96652f7e-kube-api-access-j449h" (OuterVolumeSpecName: "kube-api-access-j449h") pod "4f375d3f-b229-4f84-8caa-3c5e96652f7e" (UID: "4f375d3f-b229-4f84-8caa-3c5e96652f7e"). InnerVolumeSpecName "kube-api-access-j449h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.090421 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f375d3f-b229-4f84-8caa-3c5e96652f7e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "4f375d3f-b229-4f84-8caa-3c5e96652f7e" (UID: "4f375d3f-b229-4f84-8caa-3c5e96652f7e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.095674 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f375d3f-b229-4f84-8caa-3c5e96652f7e-scripts" (OuterVolumeSpecName: "scripts") pod "4f375d3f-b229-4f84-8caa-3c5e96652f7e" (UID: "4f375d3f-b229-4f84-8caa-3c5e96652f7e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.095997 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f375d3f-b229-4f84-8caa-3c5e96652f7e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "4f375d3f-b229-4f84-8caa-3c5e96652f7e" (UID: "4f375d3f-b229-4f84-8caa-3c5e96652f7e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.166765 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/4f375d3f-b229-4f84-8caa-3c5e96652f7e-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.166799 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j449h\" (UniqueName: \"kubernetes.io/projected/4f375d3f-b229-4f84-8caa-3c5e96652f7e-kube-api-access-j449h\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.166813 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/4f375d3f-b229-4f84-8caa-3c5e96652f7e-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.166825 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f375d3f-b229-4f84-8caa-3c5e96652f7e-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.371750 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-629nq" Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.428645 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f375d3f-b229-4f84-8caa-3c5e96652f7e" path="/var/lib/kubelet/pods/4f375d3f-b229-4f84-8caa-3c5e96652f7e/volumes" Mar 07 
07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.484595 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-629nq"] Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.527136 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k7wrp"] Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.527423 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k7wrp" podUID="8e72c18f-8de1-4466-9614-d2b6b27749a3" containerName="registry-server" containerID="cri-o://69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c" gracePeriod=2 Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.595965 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-w5ktg" Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.595964 4738 scope.go:117] "RemoveContainer" containerID="e2d7c30877f3b3df6c1c11a7650b6b88908ac327168c8b37e4f906573c64ffba" Mar 07 07:39:02 crc kubenswrapper[4738]: E0307 07:39:02.611044 4738 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c is running failed: container process not found" containerID="69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c" cmd=["grpc_health_probe","-addr=:50051"] Mar 07 07:39:02 crc kubenswrapper[4738]: E0307 07:39:02.611511 4738 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c is running failed: container process not found" containerID="69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c" cmd=["grpc_health_probe","-addr=:50051"] Mar 07 
07:39:02 crc kubenswrapper[4738]: E0307 07:39:02.611762 4738 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c is running failed: container process not found" containerID="69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c" cmd=["grpc_health_probe","-addr=:50051"] Mar 07 07:39:02 crc kubenswrapper[4738]: E0307 07:39:02.611793 4738 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-k7wrp" podUID="8e72c18f-8de1-4466-9614-d2b6b27749a3" containerName="registry-server" Mar 07 07:39:02 crc kubenswrapper[4738]: I0307 07:39:02.946759 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.082709 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e72c18f-8de1-4466-9614-d2b6b27749a3-catalog-content\") pod \"8e72c18f-8de1-4466-9614-d2b6b27749a3\" (UID: \"8e72c18f-8de1-4466-9614-d2b6b27749a3\") " Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.082776 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvb95\" (UniqueName: \"kubernetes.io/projected/8e72c18f-8de1-4466-9614-d2b6b27749a3-kube-api-access-cvb95\") pod \"8e72c18f-8de1-4466-9614-d2b6b27749a3\" (UID: \"8e72c18f-8de1-4466-9614-d2b6b27749a3\") " Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.082804 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e72c18f-8de1-4466-9614-d2b6b27749a3-utilities\") pod \"8e72c18f-8de1-4466-9614-d2b6b27749a3\" (UID: \"8e72c18f-8de1-4466-9614-d2b6b27749a3\") " Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.083557 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e72c18f-8de1-4466-9614-d2b6b27749a3-utilities" (OuterVolumeSpecName: "utilities") pod "8e72c18f-8de1-4466-9614-d2b6b27749a3" (UID: "8e72c18f-8de1-4466-9614-d2b6b27749a3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.099955 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e72c18f-8de1-4466-9614-d2b6b27749a3-kube-api-access-cvb95" (OuterVolumeSpecName: "kube-api-access-cvb95") pod "8e72c18f-8de1-4466-9614-d2b6b27749a3" (UID: "8e72c18f-8de1-4466-9614-d2b6b27749a3"). InnerVolumeSpecName "kube-api-access-cvb95". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.114237 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn"] Mar 07 07:39:03 crc kubenswrapper[4738]: E0307 07:39:03.114770 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f375d3f-b229-4f84-8caa-3c5e96652f7e" containerName="swift-ring-rebalance" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.114792 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f375d3f-b229-4f84-8caa-3c5e96652f7e" containerName="swift-ring-rebalance" Mar 07 07:39:03 crc kubenswrapper[4738]: E0307 07:39:03.114810 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e72c18f-8de1-4466-9614-d2b6b27749a3" containerName="extract-content" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.114838 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e72c18f-8de1-4466-9614-d2b6b27749a3" containerName="extract-content" Mar 07 07:39:03 crc kubenswrapper[4738]: E0307 07:39:03.114865 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e72c18f-8de1-4466-9614-d2b6b27749a3" containerName="registry-server" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.114874 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e72c18f-8de1-4466-9614-d2b6b27749a3" containerName="registry-server" Mar 07 07:39:03 crc kubenswrapper[4738]: E0307 07:39:03.114923 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e72c18f-8de1-4466-9614-d2b6b27749a3" containerName="extract-utilities" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.114932 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e72c18f-8de1-4466-9614-d2b6b27749a3" containerName="extract-utilities" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.115176 4738 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4f375d3f-b229-4f84-8caa-3c5e96652f7e" containerName="swift-ring-rebalance" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.115200 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e72c18f-8de1-4466-9614-d2b6b27749a3" containerName="registry-server" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.115832 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.121651 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn"] Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.126280 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.126556 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.150555 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e72c18f-8de1-4466-9614-d2b6b27749a3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e72c18f-8de1-4466-9614-d2b6b27749a3" (UID: "8e72c18f-8de1-4466-9614-d2b6b27749a3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.185132 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-etc-swift\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.185261 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-dispersionconf\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.185299 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-ring-data-devices\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.185424 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tthr\" (UniqueName: \"kubernetes.io/projected/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-kube-api-access-8tthr\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.185603 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-swiftconf\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.185642 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-scripts\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.185745 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e72c18f-8de1-4466-9614-d2b6b27749a3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.185759 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvb95\" (UniqueName: \"kubernetes.io/projected/8e72c18f-8de1-4466-9614-d2b6b27749a3-kube-api-access-cvb95\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.186012 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e72c18f-8de1-4466-9614-d2b6b27749a3-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.287655 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tthr\" (UniqueName: \"kubernetes.io/projected/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-kube-api-access-8tthr\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.287731 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-swiftconf\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.287756 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-scripts\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.287791 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-etc-swift\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.287818 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-dispersionconf\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.287837 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-ring-data-devices\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.288645 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-etc-swift\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.288733 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-ring-data-devices\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.289112 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-scripts\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.292850 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-dispersionconf\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.293143 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-swiftconf\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.317689 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tthr\" (UniqueName: 
\"kubernetes.io/projected/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-kube-api-access-8tthr\") pod \"swift-ring-rebalance-debug-jg4qn\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.429798 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.628331 4738 generic.go:334] "Generic (PLEG): container finished" podID="8e72c18f-8de1-4466-9614-d2b6b27749a3" containerID="69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c" exitCode=0 Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.628682 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7wrp" event={"ID":"8e72c18f-8de1-4466-9614-d2b6b27749a3","Type":"ContainerDied","Data":"69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c"} Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.628711 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k7wrp" event={"ID":"8e72c18f-8de1-4466-9614-d2b6b27749a3","Type":"ContainerDied","Data":"27628b148e6100720b5cb58ef20fdc979b2d793fe6ab2a0401b11a689be438eb"} Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.628729 4738 scope.go:117] "RemoveContainer" containerID="69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.628858 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-k7wrp" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.629523 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn"] Mar 07 07:39:03 crc kubenswrapper[4738]: W0307 07:39:03.635765 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podafb48577_3d58_4f53_abe5_8f5cb8deb7c4.slice/crio-d50df3845ef4b9a5830f5e7f185430790e08a4ea3f12b3a51daba4c6e08947e2 WatchSource:0}: Error finding container d50df3845ef4b9a5830f5e7f185430790e08a4ea3f12b3a51daba4c6e08947e2: Status 404 returned error can't find the container with id d50df3845ef4b9a5830f5e7f185430790e08a4ea3f12b3a51daba4c6e08947e2 Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.659257 4738 scope.go:117] "RemoveContainer" containerID="8c81a0bf7d55f22e69f6d936087a0c67b8a26f25747ba93c00aea2274a07c20c" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.665067 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k7wrp"] Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.670262 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k7wrp"] Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.704234 4738 scope.go:117] "RemoveContainer" containerID="94920fa204cc71ff387779a830977b692db29e09ebcb4ea611908a22b4e968e5" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.725381 4738 scope.go:117] "RemoveContainer" containerID="69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c" Mar 07 07:39:03 crc kubenswrapper[4738]: E0307 07:39:03.725795 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c\": container with ID starting with 
69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c not found: ID does not exist" containerID="69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.725836 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c"} err="failed to get container status \"69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c\": rpc error: code = NotFound desc = could not find container \"69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c\": container with ID starting with 69ce6dd367a3be70564e0b049900778640f47912575916b22b0bcf212a41195c not found: ID does not exist" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.725862 4738 scope.go:117] "RemoveContainer" containerID="8c81a0bf7d55f22e69f6d936087a0c67b8a26f25747ba93c00aea2274a07c20c" Mar 07 07:39:03 crc kubenswrapper[4738]: E0307 07:39:03.726176 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c81a0bf7d55f22e69f6d936087a0c67b8a26f25747ba93c00aea2274a07c20c\": container with ID starting with 8c81a0bf7d55f22e69f6d936087a0c67b8a26f25747ba93c00aea2274a07c20c not found: ID does not exist" containerID="8c81a0bf7d55f22e69f6d936087a0c67b8a26f25747ba93c00aea2274a07c20c" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.726256 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c81a0bf7d55f22e69f6d936087a0c67b8a26f25747ba93c00aea2274a07c20c"} err="failed to get container status \"8c81a0bf7d55f22e69f6d936087a0c67b8a26f25747ba93c00aea2274a07c20c\": rpc error: code = NotFound desc = could not find container \"8c81a0bf7d55f22e69f6d936087a0c67b8a26f25747ba93c00aea2274a07c20c\": container with ID starting with 8c81a0bf7d55f22e69f6d936087a0c67b8a26f25747ba93c00aea2274a07c20c not found: ID does not 
exist" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.726322 4738 scope.go:117] "RemoveContainer" containerID="94920fa204cc71ff387779a830977b692db29e09ebcb4ea611908a22b4e968e5" Mar 07 07:39:03 crc kubenswrapper[4738]: E0307 07:39:03.726606 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94920fa204cc71ff387779a830977b692db29e09ebcb4ea611908a22b4e968e5\": container with ID starting with 94920fa204cc71ff387779a830977b692db29e09ebcb4ea611908a22b4e968e5 not found: ID does not exist" containerID="94920fa204cc71ff387779a830977b692db29e09ebcb4ea611908a22b4e968e5" Mar 07 07:39:03 crc kubenswrapper[4738]: I0307 07:39:03.726630 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94920fa204cc71ff387779a830977b692db29e09ebcb4ea611908a22b4e968e5"} err="failed to get container status \"94920fa204cc71ff387779a830977b692db29e09ebcb4ea611908a22b4e968e5\": rpc error: code = NotFound desc = could not find container \"94920fa204cc71ff387779a830977b692db29e09ebcb4ea611908a22b4e968e5\": container with ID starting with 94920fa204cc71ff387779a830977b692db29e09ebcb4ea611908a22b4e968e5 not found: ID does not exist" Mar 07 07:39:04 crc kubenswrapper[4738]: I0307 07:39:04.397109 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e72c18f-8de1-4466-9614-d2b6b27749a3" path="/var/lib/kubelet/pods/8e72c18f-8de1-4466-9614-d2b6b27749a3/volumes" Mar 07 07:39:04 crc kubenswrapper[4738]: I0307 07:39:04.639888 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" event={"ID":"afb48577-3d58-4f53-abe5-8f5cb8deb7c4","Type":"ContainerStarted","Data":"9521ab12d63a43a24ce0f7cb4ab290e6a2e30dd38787c73e47bfbadfd4482144"} Mar 07 07:39:04 crc kubenswrapper[4738]: I0307 07:39:04.640205 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" 
event={"ID":"afb48577-3d58-4f53-abe5-8f5cb8deb7c4","Type":"ContainerStarted","Data":"d50df3845ef4b9a5830f5e7f185430790e08a4ea3f12b3a51daba4c6e08947e2"} Mar 07 07:39:04 crc kubenswrapper[4738]: I0307 07:39:04.668962 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" podStartSLOduration=1.668941403 podStartE2EDuration="1.668941403s" podCreationTimestamp="2026-03-07 07:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:39:04.662994434 +0000 UTC m=+2363.127981765" watchObservedRunningTime="2026-03-07 07:39:04.668941403 +0000 UTC m=+2363.133928724" Mar 07 07:39:05 crc kubenswrapper[4738]: I0307 07:39:05.666205 4738 generic.go:334] "Generic (PLEG): container finished" podID="afb48577-3d58-4f53-abe5-8f5cb8deb7c4" containerID="9521ab12d63a43a24ce0f7cb4ab290e6a2e30dd38787c73e47bfbadfd4482144" exitCode=0 Mar 07 07:39:05 crc kubenswrapper[4738]: I0307 07:39:05.666571 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" event={"ID":"afb48577-3d58-4f53-abe5-8f5cb8deb7c4","Type":"ContainerDied","Data":"9521ab12d63a43a24ce0f7cb4ab290e6a2e30dd38787c73e47bfbadfd4482144"} Mar 07 07:39:06 crc kubenswrapper[4738]: I0307 07:39:06.986369 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.021976 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn"] Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.031859 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn"] Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.048636 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-ring-data-devices\") pod \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.048749 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tthr\" (UniqueName: \"kubernetes.io/projected/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-kube-api-access-8tthr\") pod \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.048803 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-dispersionconf\") pod \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.048844 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-scripts\") pod \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.048885 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-etc-swift\") pod \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.048954 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-swiftconf\") pod \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\" (UID: \"afb48577-3d58-4f53-abe5-8f5cb8deb7c4\") " Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.055443 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "afb48577-3d58-4f53-abe5-8f5cb8deb7c4" (UID: "afb48577-3d58-4f53-abe5-8f5cb8deb7c4"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.056304 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "afb48577-3d58-4f53-abe5-8f5cb8deb7c4" (UID: "afb48577-3d58-4f53-abe5-8f5cb8deb7c4"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.070787 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-kube-api-access-8tthr" (OuterVolumeSpecName: "kube-api-access-8tthr") pod "afb48577-3d58-4f53-abe5-8f5cb8deb7c4" (UID: "afb48577-3d58-4f53-abe5-8f5cb8deb7c4"). InnerVolumeSpecName "kube-api-access-8tthr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.072122 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-scripts" (OuterVolumeSpecName: "scripts") pod "afb48577-3d58-4f53-abe5-8f5cb8deb7c4" (UID: "afb48577-3d58-4f53-abe5-8f5cb8deb7c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.078960 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "afb48577-3d58-4f53-abe5-8f5cb8deb7c4" (UID: "afb48577-3d58-4f53-abe5-8f5cb8deb7c4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.082239 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "afb48577-3d58-4f53-abe5-8f5cb8deb7c4" (UID: "afb48577-3d58-4f53-abe5-8f5cb8deb7c4"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.151142 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.151359 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.151414 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.151463 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.151544 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tthr\" (UniqueName: \"kubernetes.io/projected/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-kube-api-access-8tthr\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.151601 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/afb48577-3d58-4f53-abe5-8f5cb8deb7c4-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.685482 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d50df3845ef4b9a5830f5e7f185430790e08a4ea3f12b3a51daba4c6e08947e2" Mar 07 07:39:07 crc kubenswrapper[4738]: I0307 07:39:07.685552 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jg4qn" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.220266 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g"] Mar 07 07:39:08 crc kubenswrapper[4738]: E0307 07:39:08.220601 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb48577-3d58-4f53-abe5-8f5cb8deb7c4" containerName="swift-ring-rebalance" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.220616 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb48577-3d58-4f53-abe5-8f5cb8deb7c4" containerName="swift-ring-rebalance" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.220782 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb48577-3d58-4f53-abe5-8f5cb8deb7c4" containerName="swift-ring-rebalance" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.221341 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.224377 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.227440 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.235984 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g"] Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.372523 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-etc-swift\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.372657 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-ring-data-devices\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.372730 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-swiftconf\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.372975 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-scripts\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.373042 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lclp6\" (UniqueName: \"kubernetes.io/projected/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-kube-api-access-lclp6\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.373106 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-dispersionconf\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.400487 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb48577-3d58-4f53-abe5-8f5cb8deb7c4" path="/var/lib/kubelet/pods/afb48577-3d58-4f53-abe5-8f5cb8deb7c4/volumes" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.474193 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-dispersionconf\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.474322 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-etc-swift\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.474390 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-ring-data-devices\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.474431 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-swiftconf\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: 
\"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.474612 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-scripts\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.474646 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lclp6\" (UniqueName: \"kubernetes.io/projected/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-kube-api-access-lclp6\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.474998 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-etc-swift\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.475526 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-ring-data-devices\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.475807 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-scripts\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: 
\"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.479975 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-dispersionconf\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.480438 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-swiftconf\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.496568 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lclp6\" (UniqueName: \"kubernetes.io/projected/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-kube-api-access-lclp6\") pod \"swift-ring-rebalance-debug-ccs2g\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.537858 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:08 crc kubenswrapper[4738]: I0307 07:39:08.971052 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g"] Mar 07 07:39:09 crc kubenswrapper[4738]: I0307 07:39:09.709675 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" event={"ID":"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd","Type":"ContainerStarted","Data":"59d22006bb52dfd8e5b6c0c6a5edaba794fa0e48f9c3b787979dbb0e9c3ba93c"} Mar 07 07:39:09 crc kubenswrapper[4738]: I0307 07:39:09.711550 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" event={"ID":"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd","Type":"ContainerStarted","Data":"f083b342c56405311bb416bdfdc5c67c80cfa7ee2e9204c3190745b9031215dd"} Mar 07 07:39:09 crc kubenswrapper[4738]: I0307 07:39:09.739727 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" podStartSLOduration=1.739711666 podStartE2EDuration="1.739711666s" podCreationTimestamp="2026-03-07 07:39:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:39:09.731186568 +0000 UTC m=+2368.196173909" watchObservedRunningTime="2026-03-07 07:39:09.739711666 +0000 UTC m=+2368.204698977" Mar 07 07:39:10 crc kubenswrapper[4738]: I0307 07:39:10.721427 4738 generic.go:334] "Generic (PLEG): container finished" podID="f2ab2001-17f4-4295-83a5-2d1b2a9ecebd" containerID="59d22006bb52dfd8e5b6c0c6a5edaba794fa0e48f9c3b787979dbb0e9c3ba93c" exitCode=0 Mar 07 07:39:10 crc kubenswrapper[4738]: I0307 07:39:10.721470 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" 
event={"ID":"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd","Type":"ContainerDied","Data":"59d22006bb52dfd8e5b6c0c6a5edaba794fa0e48f9c3b787979dbb0e9c3ba93c"} Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.070626 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.115847 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g"] Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.123104 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g"] Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.128141 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-etc-swift\") pod \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.128246 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-dispersionconf\") pod \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.128339 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-swiftconf\") pod \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.128374 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-ring-data-devices\") pod \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.128420 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lclp6\" (UniqueName: \"kubernetes.io/projected/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-kube-api-access-lclp6\") pod \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.128489 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-scripts\") pod \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\" (UID: \"f2ab2001-17f4-4295-83a5-2d1b2a9ecebd\") " Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.129102 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "f2ab2001-17f4-4295-83a5-2d1b2a9ecebd" (UID: "f2ab2001-17f4-4295-83a5-2d1b2a9ecebd"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.129118 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "f2ab2001-17f4-4295-83a5-2d1b2a9ecebd" (UID: "f2ab2001-17f4-4295-83a5-2d1b2a9ecebd"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.129423 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.129451 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.134294 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-kube-api-access-lclp6" (OuterVolumeSpecName: "kube-api-access-lclp6") pod "f2ab2001-17f4-4295-83a5-2d1b2a9ecebd" (UID: "f2ab2001-17f4-4295-83a5-2d1b2a9ecebd"). InnerVolumeSpecName "kube-api-access-lclp6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.150733 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-scripts" (OuterVolumeSpecName: "scripts") pod "f2ab2001-17f4-4295-83a5-2d1b2a9ecebd" (UID: "f2ab2001-17f4-4295-83a5-2d1b2a9ecebd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.156794 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "f2ab2001-17f4-4295-83a5-2d1b2a9ecebd" (UID: "f2ab2001-17f4-4295-83a5-2d1b2a9ecebd"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.159641 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "f2ab2001-17f4-4295-83a5-2d1b2a9ecebd" (UID: "f2ab2001-17f4-4295-83a5-2d1b2a9ecebd"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.230959 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.230995 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.231008 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.231017 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lclp6\" (UniqueName: \"kubernetes.io/projected/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd-kube-api-access-lclp6\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.397681 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2ab2001-17f4-4295-83a5-2d1b2a9ecebd" path="/var/lib/kubelet/pods/f2ab2001-17f4-4295-83a5-2d1b2a9ecebd/volumes" Mar 07 07:39:12 crc kubenswrapper[4738]: I0307 07:39:12.749536 4738 scope.go:117] "RemoveContainer" containerID="59d22006bb52dfd8e5b6c0c6a5edaba794fa0e48f9c3b787979dbb0e9c3ba93c" Mar 07 07:39:12 crc kubenswrapper[4738]: 
I0307 07:39:12.749643 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ccs2g" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.315463 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l"] Mar 07 07:39:13 crc kubenswrapper[4738]: E0307 07:39:13.316376 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2ab2001-17f4-4295-83a5-2d1b2a9ecebd" containerName="swift-ring-rebalance" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.316448 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2ab2001-17f4-4295-83a5-2d1b2a9ecebd" containerName="swift-ring-rebalance" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.316638 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2ab2001-17f4-4295-83a5-2d1b2a9ecebd" containerName="swift-ring-rebalance" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.317187 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.319467 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.319646 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.335736 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l"] Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.451118 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a7a9b782-885a-470c-9c98-c647c8c1f4a4-dispersionconf\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.451239 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7a9b782-885a-470c-9c98-c647c8c1f4a4-scripts\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.451299 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a7a9b782-885a-470c-9c98-c647c8c1f4a4-swiftconf\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.451345 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-6xts8\" (UniqueName: \"kubernetes.io/projected/a7a9b782-885a-470c-9c98-c647c8c1f4a4-kube-api-access-6xts8\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.451389 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a7a9b782-885a-470c-9c98-c647c8c1f4a4-ring-data-devices\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.451427 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a7a9b782-885a-470c-9c98-c647c8c1f4a4-etc-swift\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.552574 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a7a9b782-885a-470c-9c98-c647c8c1f4a4-dispersionconf\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.552665 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7a9b782-885a-470c-9c98-c647c8c1f4a4-scripts\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.552706 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a7a9b782-885a-470c-9c98-c647c8c1f4a4-swiftconf\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.552754 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xts8\" (UniqueName: \"kubernetes.io/projected/a7a9b782-885a-470c-9c98-c647c8c1f4a4-kube-api-access-6xts8\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.552788 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a7a9b782-885a-470c-9c98-c647c8c1f4a4-ring-data-devices\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.552820 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a7a9b782-885a-470c-9c98-c647c8c1f4a4-etc-swift\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.553366 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a7a9b782-885a-470c-9c98-c647c8c1f4a4-etc-swift\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.553914 
4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a7a9b782-885a-470c-9c98-c647c8c1f4a4-ring-data-devices\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.554316 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7a9b782-885a-470c-9c98-c647c8c1f4a4-scripts\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.561646 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a7a9b782-885a-470c-9c98-c647c8c1f4a4-dispersionconf\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.563933 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a7a9b782-885a-470c-9c98-c647c8c1f4a4-swiftconf\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.571964 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xts8\" (UniqueName: \"kubernetes.io/projected/a7a9b782-885a-470c-9c98-c647c8c1f4a4-kube-api-access-6xts8\") pod \"swift-ring-rebalance-debug-zkk9l\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.641082 4738 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:13 crc kubenswrapper[4738]: I0307 07:39:13.916990 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l"] Mar 07 07:39:14 crc kubenswrapper[4738]: I0307 07:39:14.810620 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" event={"ID":"a7a9b782-885a-470c-9c98-c647c8c1f4a4","Type":"ContainerStarted","Data":"765e90488d1127368807c1f5d5a90aa5f37788f6f8bcd4327d556eb6b87acdc8"} Mar 07 07:39:14 crc kubenswrapper[4738]: I0307 07:39:14.810971 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" event={"ID":"a7a9b782-885a-470c-9c98-c647c8c1f4a4","Type":"ContainerStarted","Data":"51063220668edeed0fff61604650538621ff8272a827b87b3318adb0cf7d1681"} Mar 07 07:39:14 crc kubenswrapper[4738]: I0307 07:39:14.841380 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" podStartSLOduration=1.8413585829999999 podStartE2EDuration="1.841358583s" podCreationTimestamp="2026-03-07 07:39:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:39:14.836196455 +0000 UTC m=+2373.301183786" watchObservedRunningTime="2026-03-07 07:39:14.841358583 +0000 UTC m=+2373.306345914" Mar 07 07:39:15 crc kubenswrapper[4738]: I0307 07:39:15.833890 4738 generic.go:334] "Generic (PLEG): container finished" podID="a7a9b782-885a-470c-9c98-c647c8c1f4a4" containerID="765e90488d1127368807c1f5d5a90aa5f37788f6f8bcd4327d556eb6b87acdc8" exitCode=0 Mar 07 07:39:15 crc kubenswrapper[4738]: I0307 07:39:15.833950 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" 
event={"ID":"a7a9b782-885a-470c-9c98-c647c8c1f4a4","Type":"ContainerDied","Data":"765e90488d1127368807c1f5d5a90aa5f37788f6f8bcd4327d556eb6b87acdc8"} Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.146749 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.183023 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l"] Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.191737 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l"] Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.204851 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a7a9b782-885a-470c-9c98-c647c8c1f4a4-etc-swift\") pod \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.205862 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7a9b782-885a-470c-9c98-c647c8c1f4a4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a7a9b782-885a-470c-9c98-c647c8c1f4a4" (UID: "a7a9b782-885a-470c-9c98-c647c8c1f4a4"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.205996 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a7a9b782-885a-470c-9c98-c647c8c1f4a4-swiftconf\") pod \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.206027 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xts8\" (UniqueName: \"kubernetes.io/projected/a7a9b782-885a-470c-9c98-c647c8c1f4a4-kube-api-access-6xts8\") pod \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.206095 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a7a9b782-885a-470c-9c98-c647c8c1f4a4-ring-data-devices\") pod \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.206886 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7a9b782-885a-470c-9c98-c647c8c1f4a4-scripts\") pod \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.206876 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7a9b782-885a-470c-9c98-c647c8c1f4a4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a7a9b782-885a-470c-9c98-c647c8c1f4a4" (UID: "a7a9b782-885a-470c-9c98-c647c8c1f4a4"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.207003 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a7a9b782-885a-470c-9c98-c647c8c1f4a4-dispersionconf\") pod \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\" (UID: \"a7a9b782-885a-470c-9c98-c647c8c1f4a4\") " Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.207389 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a7a9b782-885a-470c-9c98-c647c8c1f4a4-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.207406 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a7a9b782-885a-470c-9c98-c647c8c1f4a4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.211726 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7a9b782-885a-470c-9c98-c647c8c1f4a4-kube-api-access-6xts8" (OuterVolumeSpecName: "kube-api-access-6xts8") pod "a7a9b782-885a-470c-9c98-c647c8c1f4a4" (UID: "a7a9b782-885a-470c-9c98-c647c8c1f4a4"). InnerVolumeSpecName "kube-api-access-6xts8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.226687 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a7a9b782-885a-470c-9c98-c647c8c1f4a4-scripts" (OuterVolumeSpecName: "scripts") pod "a7a9b782-885a-470c-9c98-c647c8c1f4a4" (UID: "a7a9b782-885a-470c-9c98-c647c8c1f4a4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.233481 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a9b782-885a-470c-9c98-c647c8c1f4a4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a7a9b782-885a-470c-9c98-c647c8c1f4a4" (UID: "a7a9b782-885a-470c-9c98-c647c8c1f4a4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.234336 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a9b782-885a-470c-9c98-c647c8c1f4a4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a7a9b782-885a-470c-9c98-c647c8c1f4a4" (UID: "a7a9b782-885a-470c-9c98-c647c8c1f4a4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.308596 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a7a9b782-885a-470c-9c98-c647c8c1f4a4-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.308629 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xts8\" (UniqueName: \"kubernetes.io/projected/a7a9b782-885a-470c-9c98-c647c8c1f4a4-kube-api-access-6xts8\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.308643 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a7a9b782-885a-470c-9c98-c647c8c1f4a4-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.308654 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a7a9b782-885a-470c-9c98-c647c8c1f4a4-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:17 crc 
kubenswrapper[4738]: I0307 07:39:17.862520 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51063220668edeed0fff61604650538621ff8272a827b87b3318adb0cf7d1681" Mar 07 07:39:17 crc kubenswrapper[4738]: I0307 07:39:17.862625 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zkk9l" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.395366 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7a9b782-885a-470c-9c98-c647c8c1f4a4" path="/var/lib/kubelet/pods/a7a9b782-885a-470c-9c98-c647c8c1f4a4/volumes" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.395803 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws"] Mar 07 07:39:18 crc kubenswrapper[4738]: E0307 07:39:18.396032 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a9b782-885a-470c-9c98-c647c8c1f4a4" containerName="swift-ring-rebalance" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.396048 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a9b782-885a-470c-9c98-c647c8c1f4a4" containerName="swift-ring-rebalance" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.396204 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7a9b782-885a-470c-9c98-c647c8c1f4a4" containerName="swift-ring-rebalance" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.396619 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws"] Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.396713 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.398978 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.399437 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.527606 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d3d7da53-5bc6-4b88-ba56-954e0c956106-swiftconf\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.527690 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3d7da53-5bc6-4b88-ba56-954e0c956106-scripts\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.527892 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d3d7da53-5bc6-4b88-ba56-954e0c956106-etc-swift\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.527940 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d3d7da53-5bc6-4b88-ba56-954e0c956106-dispersionconf\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: 
\"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.527995 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d3d7da53-5bc6-4b88-ba56-954e0c956106-ring-data-devices\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.528065 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9vb6\" (UniqueName: \"kubernetes.io/projected/d3d7da53-5bc6-4b88-ba56-954e0c956106-kube-api-access-b9vb6\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.629200 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3d7da53-5bc6-4b88-ba56-954e0c956106-scripts\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.629256 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d3d7da53-5bc6-4b88-ba56-954e0c956106-etc-swift\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.629274 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/d3d7da53-5bc6-4b88-ba56-954e0c956106-dispersionconf\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.629300 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d3d7da53-5bc6-4b88-ba56-954e0c956106-ring-data-devices\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.629330 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9vb6\" (UniqueName: \"kubernetes.io/projected/d3d7da53-5bc6-4b88-ba56-954e0c956106-kube-api-access-b9vb6\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.629397 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d3d7da53-5bc6-4b88-ba56-954e0c956106-swiftconf\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.630028 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d3d7da53-5bc6-4b88-ba56-954e0c956106-etc-swift\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.630422 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d3d7da53-5bc6-4b88-ba56-954e0c956106-scripts\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.630464 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d3d7da53-5bc6-4b88-ba56-954e0c956106-ring-data-devices\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.635269 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d3d7da53-5bc6-4b88-ba56-954e0c956106-swiftconf\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.644698 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d3d7da53-5bc6-4b88-ba56-954e0c956106-dispersionconf\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.649548 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9vb6\" (UniqueName: \"kubernetes.io/projected/d3d7da53-5bc6-4b88-ba56-954e0c956106-kube-api-access-b9vb6\") pod \"swift-ring-rebalance-debug-zl8ws\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:18 crc kubenswrapper[4738]: I0307 07:39:18.727215 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:19 crc kubenswrapper[4738]: W0307 07:39:19.214770 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3d7da53_5bc6_4b88_ba56_954e0c956106.slice/crio-adbc4881ac6ca0d0ef19b8116d93737f13bbf7fda8ac6a22892856c021f119d8 WatchSource:0}: Error finding container adbc4881ac6ca0d0ef19b8116d93737f13bbf7fda8ac6a22892856c021f119d8: Status 404 returned error can't find the container with id adbc4881ac6ca0d0ef19b8116d93737f13bbf7fda8ac6a22892856c021f119d8 Mar 07 07:39:19 crc kubenswrapper[4738]: I0307 07:39:19.217872 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws"] Mar 07 07:39:19 crc kubenswrapper[4738]: I0307 07:39:19.880454 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" event={"ID":"d3d7da53-5bc6-4b88-ba56-954e0c956106","Type":"ContainerStarted","Data":"6fb734fff15727b7a529f28611fc3b8b353a06103feb2d5719d16f750aeba18f"} Mar 07 07:39:19 crc kubenswrapper[4738]: I0307 07:39:19.880820 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" event={"ID":"d3d7da53-5bc6-4b88-ba56-954e0c956106","Type":"ContainerStarted","Data":"adbc4881ac6ca0d0ef19b8116d93737f13bbf7fda8ac6a22892856c021f119d8"} Mar 07 07:39:19 crc kubenswrapper[4738]: I0307 07:39:19.896215 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" podStartSLOduration=1.896196172 podStartE2EDuration="1.896196172s" podCreationTimestamp="2026-03-07 07:39:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:39:19.893041138 +0000 UTC m=+2378.358028459" watchObservedRunningTime="2026-03-07 
07:39:19.896196172 +0000 UTC m=+2378.361183493" Mar 07 07:39:20 crc kubenswrapper[4738]: I0307 07:39:20.892290 4738 generic.go:334] "Generic (PLEG): container finished" podID="d3d7da53-5bc6-4b88-ba56-954e0c956106" containerID="6fb734fff15727b7a529f28611fc3b8b353a06103feb2d5719d16f750aeba18f" exitCode=0 Mar 07 07:39:20 crc kubenswrapper[4738]: I0307 07:39:20.892483 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" event={"ID":"d3d7da53-5bc6-4b88-ba56-954e0c956106","Type":"ContainerDied","Data":"6fb734fff15727b7a529f28611fc3b8b353a06103feb2d5719d16f750aeba18f"} Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.187061 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.232653 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws"] Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.240585 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws"] Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.282803 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d3d7da53-5bc6-4b88-ba56-954e0c956106-dispersionconf\") pod \"d3d7da53-5bc6-4b88-ba56-954e0c956106\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.282858 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3d7da53-5bc6-4b88-ba56-954e0c956106-scripts\") pod \"d3d7da53-5bc6-4b88-ba56-954e0c956106\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.282910 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d3d7da53-5bc6-4b88-ba56-954e0c956106-ring-data-devices\") pod \"d3d7da53-5bc6-4b88-ba56-954e0c956106\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.282959 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d3d7da53-5bc6-4b88-ba56-954e0c956106-etc-swift\") pod \"d3d7da53-5bc6-4b88-ba56-954e0c956106\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.283007 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9vb6\" (UniqueName: \"kubernetes.io/projected/d3d7da53-5bc6-4b88-ba56-954e0c956106-kube-api-access-b9vb6\") pod \"d3d7da53-5bc6-4b88-ba56-954e0c956106\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.283050 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d3d7da53-5bc6-4b88-ba56-954e0c956106-swiftconf\") pod \"d3d7da53-5bc6-4b88-ba56-954e0c956106\" (UID: \"d3d7da53-5bc6-4b88-ba56-954e0c956106\") " Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.283773 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3d7da53-5bc6-4b88-ba56-954e0c956106-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "d3d7da53-5bc6-4b88-ba56-954e0c956106" (UID: "d3d7da53-5bc6-4b88-ba56-954e0c956106"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.284076 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/d3d7da53-5bc6-4b88-ba56-954e0c956106-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.284249 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3d7da53-5bc6-4b88-ba56-954e0c956106-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "d3d7da53-5bc6-4b88-ba56-954e0c956106" (UID: "d3d7da53-5bc6-4b88-ba56-954e0c956106"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.292425 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d7da53-5bc6-4b88-ba56-954e0c956106-kube-api-access-b9vb6" (OuterVolumeSpecName: "kube-api-access-b9vb6") pod "d3d7da53-5bc6-4b88-ba56-954e0c956106" (UID: "d3d7da53-5bc6-4b88-ba56-954e0c956106"). InnerVolumeSpecName "kube-api-access-b9vb6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.305252 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d7da53-5bc6-4b88-ba56-954e0c956106-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "d3d7da53-5bc6-4b88-ba56-954e0c956106" (UID: "d3d7da53-5bc6-4b88-ba56-954e0c956106"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.307677 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3d7da53-5bc6-4b88-ba56-954e0c956106-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "d3d7da53-5bc6-4b88-ba56-954e0c956106" (UID: "d3d7da53-5bc6-4b88-ba56-954e0c956106"). 
InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.310889 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3d7da53-5bc6-4b88-ba56-954e0c956106-scripts" (OuterVolumeSpecName: "scripts") pod "d3d7da53-5bc6-4b88-ba56-954e0c956106" (UID: "d3d7da53-5bc6-4b88-ba56-954e0c956106"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.385611 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/d3d7da53-5bc6-4b88-ba56-954e0c956106-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.385651 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/d3d7da53-5bc6-4b88-ba56-954e0c956106-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.385669 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d3d7da53-5bc6-4b88-ba56-954e0c956106-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.385677 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/d3d7da53-5bc6-4b88-ba56-954e0c956106-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.385686 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9vb6\" (UniqueName: \"kubernetes.io/projected/d3d7da53-5bc6-4b88-ba56-954e0c956106-kube-api-access-b9vb6\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.394361 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d3d7da53-5bc6-4b88-ba56-954e0c956106" path="/var/lib/kubelet/pods/d3d7da53-5bc6-4b88-ba56-954e0c956106/volumes" Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.914815 4738 scope.go:117] "RemoveContainer" containerID="6fb734fff15727b7a529f28611fc3b8b353a06103feb2d5719d16f750aeba18f" Mar 07 07:39:22 crc kubenswrapper[4738]: I0307 07:39:22.914891 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zl8ws" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.458599 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mczfq"] Mar 07 07:39:23 crc kubenswrapper[4738]: E0307 07:39:23.459212 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d7da53-5bc6-4b88-ba56-954e0c956106" containerName="swift-ring-rebalance" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.459227 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d7da53-5bc6-4b88-ba56-954e0c956106" containerName="swift-ring-rebalance" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.459389 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3d7da53-5bc6-4b88-ba56-954e0c956106" containerName="swift-ring-rebalance" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.459826 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.462999 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.463952 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.487590 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mczfq"] Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.526086 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgwwj\" (UniqueName: \"kubernetes.io/projected/60bf6545-45f4-460f-a331-0eeb32ff80ee-kube-api-access-kgwwj\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.526192 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60bf6545-45f4-460f-a331-0eeb32ff80ee-scripts\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.526254 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/60bf6545-45f4-460f-a331-0eeb32ff80ee-etc-swift\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.526313 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/60bf6545-45f4-460f-a331-0eeb32ff80ee-dispersionconf\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.526620 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/60bf6545-45f4-460f-a331-0eeb32ff80ee-swiftconf\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.526765 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/60bf6545-45f4-460f-a331-0eeb32ff80ee-ring-data-devices\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.627793 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/60bf6545-45f4-460f-a331-0eeb32ff80ee-swiftconf\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.627840 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/60bf6545-45f4-460f-a331-0eeb32ff80ee-ring-data-devices\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc 
kubenswrapper[4738]: I0307 07:39:23.627878 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kgwwj\" (UniqueName: \"kubernetes.io/projected/60bf6545-45f4-460f-a331-0eeb32ff80ee-kube-api-access-kgwwj\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.627903 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60bf6545-45f4-460f-a331-0eeb32ff80ee-scripts\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.627928 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/60bf6545-45f4-460f-a331-0eeb32ff80ee-etc-swift\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.627956 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/60bf6545-45f4-460f-a331-0eeb32ff80ee-dispersionconf\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.628690 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/60bf6545-45f4-460f-a331-0eeb32ff80ee-etc-swift\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc 
kubenswrapper[4738]: I0307 07:39:23.629051 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/60bf6545-45f4-460f-a331-0eeb32ff80ee-ring-data-devices\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.629056 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60bf6545-45f4-460f-a331-0eeb32ff80ee-scripts\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.635934 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/60bf6545-45f4-460f-a331-0eeb32ff80ee-swiftconf\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.637796 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/60bf6545-45f4-460f-a331-0eeb32ff80ee-dispersionconf\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: I0307 07:39:23.643370 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgwwj\" (UniqueName: \"kubernetes.io/projected/60bf6545-45f4-460f-a331-0eeb32ff80ee-kube-api-access-kgwwj\") pod \"swift-ring-rebalance-debug-mczfq\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:23 crc kubenswrapper[4738]: 
I0307 07:39:23.792194 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:24 crc kubenswrapper[4738]: I0307 07:39:24.223941 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mczfq"] Mar 07 07:39:24 crc kubenswrapper[4738]: W0307 07:39:24.231395 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod60bf6545_45f4_460f_a331_0eeb32ff80ee.slice/crio-ba559baec078f0b19eaec7202e565d572524cb4f8cf292ac1a566c952639252c WatchSource:0}: Error finding container ba559baec078f0b19eaec7202e565d572524cb4f8cf292ac1a566c952639252c: Status 404 returned error can't find the container with id ba559baec078f0b19eaec7202e565d572524cb4f8cf292ac1a566c952639252c Mar 07 07:39:24 crc kubenswrapper[4738]: I0307 07:39:24.936800 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" event={"ID":"60bf6545-45f4-460f-a331-0eeb32ff80ee","Type":"ContainerStarted","Data":"fa20a124747707551ac51b7142e65d808067cf5bbd2163af96dc582826363c37"} Mar 07 07:39:24 crc kubenswrapper[4738]: I0307 07:39:24.936851 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" event={"ID":"60bf6545-45f4-460f-a331-0eeb32ff80ee","Type":"ContainerStarted","Data":"ba559baec078f0b19eaec7202e565d572524cb4f8cf292ac1a566c952639252c"} Mar 07 07:39:24 crc kubenswrapper[4738]: I0307 07:39:24.966431 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" podStartSLOduration=1.966399691 podStartE2EDuration="1.966399691s" podCreationTimestamp="2026-03-07 07:39:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:39:24.958258954 +0000 
UTC m=+2383.423246295" watchObservedRunningTime="2026-03-07 07:39:24.966399691 +0000 UTC m=+2383.431387052" Mar 07 07:39:25 crc kubenswrapper[4738]: I0307 07:39:25.948850 4738 generic.go:334] "Generic (PLEG): container finished" podID="60bf6545-45f4-460f-a331-0eeb32ff80ee" containerID="fa20a124747707551ac51b7142e65d808067cf5bbd2163af96dc582826363c37" exitCode=0 Mar 07 07:39:25 crc kubenswrapper[4738]: I0307 07:39:25.948915 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" event={"ID":"60bf6545-45f4-460f-a331-0eeb32ff80ee","Type":"ContainerDied","Data":"fa20a124747707551ac51b7142e65d808067cf5bbd2163af96dc582826363c37"} Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.256513 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.298132 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mczfq"] Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.312600 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mczfq"] Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.386451 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/60bf6545-45f4-460f-a331-0eeb32ff80ee-etc-swift\") pod \"60bf6545-45f4-460f-a331-0eeb32ff80ee\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.386500 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/60bf6545-45f4-460f-a331-0eeb32ff80ee-swiftconf\") pod \"60bf6545-45f4-460f-a331-0eeb32ff80ee\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.386536 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgwwj\" (UniqueName: \"kubernetes.io/projected/60bf6545-45f4-460f-a331-0eeb32ff80ee-kube-api-access-kgwwj\") pod \"60bf6545-45f4-460f-a331-0eeb32ff80ee\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.386600 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60bf6545-45f4-460f-a331-0eeb32ff80ee-scripts\") pod \"60bf6545-45f4-460f-a331-0eeb32ff80ee\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.386627 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/60bf6545-45f4-460f-a331-0eeb32ff80ee-ring-data-devices\") pod \"60bf6545-45f4-460f-a331-0eeb32ff80ee\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.386648 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/60bf6545-45f4-460f-a331-0eeb32ff80ee-dispersionconf\") pod \"60bf6545-45f4-460f-a331-0eeb32ff80ee\" (UID: \"60bf6545-45f4-460f-a331-0eeb32ff80ee\") " Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.387288 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60bf6545-45f4-460f-a331-0eeb32ff80ee-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "60bf6545-45f4-460f-a331-0eeb32ff80ee" (UID: "60bf6545-45f4-460f-a331-0eeb32ff80ee"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.387658 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60bf6545-45f4-460f-a331-0eeb32ff80ee-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "60bf6545-45f4-460f-a331-0eeb32ff80ee" (UID: "60bf6545-45f4-460f-a331-0eeb32ff80ee"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.391709 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60bf6545-45f4-460f-a331-0eeb32ff80ee-kube-api-access-kgwwj" (OuterVolumeSpecName: "kube-api-access-kgwwj") pod "60bf6545-45f4-460f-a331-0eeb32ff80ee" (UID: "60bf6545-45f4-460f-a331-0eeb32ff80ee"). InnerVolumeSpecName "kube-api-access-kgwwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.406090 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60bf6545-45f4-460f-a331-0eeb32ff80ee-scripts" (OuterVolumeSpecName: "scripts") pod "60bf6545-45f4-460f-a331-0eeb32ff80ee" (UID: "60bf6545-45f4-460f-a331-0eeb32ff80ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.410220 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60bf6545-45f4-460f-a331-0eeb32ff80ee-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "60bf6545-45f4-460f-a331-0eeb32ff80ee" (UID: "60bf6545-45f4-460f-a331-0eeb32ff80ee"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.431060 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/60bf6545-45f4-460f-a331-0eeb32ff80ee-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "60bf6545-45f4-460f-a331-0eeb32ff80ee" (UID: "60bf6545-45f4-460f-a331-0eeb32ff80ee"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.488262 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/60bf6545-45f4-460f-a331-0eeb32ff80ee-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.488293 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/60bf6545-45f4-460f-a331-0eeb32ff80ee-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.488304 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kgwwj\" (UniqueName: \"kubernetes.io/projected/60bf6545-45f4-460f-a331-0eeb32ff80ee-kube-api-access-kgwwj\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.488317 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/60bf6545-45f4-460f-a331-0eeb32ff80ee-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.488325 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/60bf6545-45f4-460f-a331-0eeb32ff80ee-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.488335 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/60bf6545-45f4-460f-a331-0eeb32ff80ee-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.969971 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba559baec078f0b19eaec7202e565d572524cb4f8cf292ac1a566c952639252c" Mar 07 07:39:27 crc kubenswrapper[4738]: I0307 07:39:27.970082 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mczfq" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.396022 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60bf6545-45f4-460f-a331-0eeb32ff80ee" path="/var/lib/kubelet/pods/60bf6545-45f4-460f-a331-0eeb32ff80ee/volumes" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.452458 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-98jk9"] Mar 07 07:39:28 crc kubenswrapper[4738]: E0307 07:39:28.452886 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60bf6545-45f4-460f-a331-0eeb32ff80ee" containerName="swift-ring-rebalance" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.452902 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="60bf6545-45f4-460f-a331-0eeb32ff80ee" containerName="swift-ring-rebalance" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.453148 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="60bf6545-45f4-460f-a331-0eeb32ff80ee" containerName="swift-ring-rebalance" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.453829 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.456077 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.456082 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.463145 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-98jk9"] Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.503703 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5c85cf60-3529-4772-a321-73fb08926e3f-ring-data-devices\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.504011 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5c85cf60-3529-4772-a321-73fb08926e3f-dispersionconf\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.504122 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrjzz\" (UniqueName: \"kubernetes.io/projected/5c85cf60-3529-4772-a321-73fb08926e3f-kube-api-access-nrjzz\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.504266 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c85cf60-3529-4772-a321-73fb08926e3f-scripts\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.504366 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5c85cf60-3529-4772-a321-73fb08926e3f-etc-swift\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.504474 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5c85cf60-3529-4772-a321-73fb08926e3f-swiftconf\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.606089 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5c85cf60-3529-4772-a321-73fb08926e3f-ring-data-devices\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.606180 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5c85cf60-3529-4772-a321-73fb08926e3f-dispersionconf\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: 
I0307 07:39:28.606225 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrjzz\" (UniqueName: \"kubernetes.io/projected/5c85cf60-3529-4772-a321-73fb08926e3f-kube-api-access-nrjzz\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.606271 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c85cf60-3529-4772-a321-73fb08926e3f-scripts\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.606285 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5c85cf60-3529-4772-a321-73fb08926e3f-etc-swift\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.606313 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5c85cf60-3529-4772-a321-73fb08926e3f-swiftconf\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.606770 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5c85cf60-3529-4772-a321-73fb08926e3f-etc-swift\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 
07:39:28.607269 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c85cf60-3529-4772-a321-73fb08926e3f-scripts\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.607456 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5c85cf60-3529-4772-a321-73fb08926e3f-ring-data-devices\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.611939 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5c85cf60-3529-4772-a321-73fb08926e3f-dispersionconf\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.612421 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5c85cf60-3529-4772-a321-73fb08926e3f-swiftconf\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.623442 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrjzz\" (UniqueName: \"kubernetes.io/projected/5c85cf60-3529-4772-a321-73fb08926e3f-kube-api-access-nrjzz\") pod \"swift-ring-rebalance-debug-98jk9\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:28 crc kubenswrapper[4738]: I0307 07:39:28.777530 4738 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:29 crc kubenswrapper[4738]: I0307 07:39:29.267676 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-98jk9"] Mar 07 07:39:29 crc kubenswrapper[4738]: I0307 07:39:29.994077 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" event={"ID":"5c85cf60-3529-4772-a321-73fb08926e3f","Type":"ContainerStarted","Data":"7521e3e6d4c22efe37d1fb931b5041f97959d49153a5c334abbc750e3d5e5d23"} Mar 07 07:39:29 crc kubenswrapper[4738]: I0307 07:39:29.994436 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" event={"ID":"5c85cf60-3529-4772-a321-73fb08926e3f","Type":"ContainerStarted","Data":"89b2899eca251f4504a8f2e44ab65c8bc0eb04a6d08fb5609fc135345635eb56"} Mar 07 07:39:32 crc kubenswrapper[4738]: I0307 07:39:32.016109 4738 generic.go:334] "Generic (PLEG): container finished" podID="5c85cf60-3529-4772-a321-73fb08926e3f" containerID="7521e3e6d4c22efe37d1fb931b5041f97959d49153a5c334abbc750e3d5e5d23" exitCode=0 Mar 07 07:39:32 crc kubenswrapper[4738]: I0307 07:39:32.016173 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" event={"ID":"5c85cf60-3529-4772-a321-73fb08926e3f","Type":"ContainerDied","Data":"7521e3e6d4c22efe37d1fb931b5041f97959d49153a5c334abbc750e3d5e5d23"} Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.357547 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.399807 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-98jk9"] Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.405182 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-98jk9"] Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.505632 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5c85cf60-3529-4772-a321-73fb08926e3f-dispersionconf\") pod \"5c85cf60-3529-4772-a321-73fb08926e3f\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.505727 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c85cf60-3529-4772-a321-73fb08926e3f-scripts\") pod \"5c85cf60-3529-4772-a321-73fb08926e3f\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.505784 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5c85cf60-3529-4772-a321-73fb08926e3f-swiftconf\") pod \"5c85cf60-3529-4772-a321-73fb08926e3f\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.505820 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrjzz\" (UniqueName: \"kubernetes.io/projected/5c85cf60-3529-4772-a321-73fb08926e3f-kube-api-access-nrjzz\") pod \"5c85cf60-3529-4772-a321-73fb08926e3f\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.505890 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" 
(UniqueName: \"kubernetes.io/empty-dir/5c85cf60-3529-4772-a321-73fb08926e3f-etc-swift\") pod \"5c85cf60-3529-4772-a321-73fb08926e3f\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.505991 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5c85cf60-3529-4772-a321-73fb08926e3f-ring-data-devices\") pod \"5c85cf60-3529-4772-a321-73fb08926e3f\" (UID: \"5c85cf60-3529-4772-a321-73fb08926e3f\") " Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.506892 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c85cf60-3529-4772-a321-73fb08926e3f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5c85cf60-3529-4772-a321-73fb08926e3f" (UID: "5c85cf60-3529-4772-a321-73fb08926e3f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.507110 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c85cf60-3529-4772-a321-73fb08926e3f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5c85cf60-3529-4772-a321-73fb08926e3f" (UID: "5c85cf60-3529-4772-a321-73fb08926e3f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.510974 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c85cf60-3529-4772-a321-73fb08926e3f-kube-api-access-nrjzz" (OuterVolumeSpecName: "kube-api-access-nrjzz") pod "5c85cf60-3529-4772-a321-73fb08926e3f" (UID: "5c85cf60-3529-4772-a321-73fb08926e3f"). InnerVolumeSpecName "kube-api-access-nrjzz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.531205 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c85cf60-3529-4772-a321-73fb08926e3f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5c85cf60-3529-4772-a321-73fb08926e3f" (UID: "5c85cf60-3529-4772-a321-73fb08926e3f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.542954 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5c85cf60-3529-4772-a321-73fb08926e3f-scripts" (OuterVolumeSpecName: "scripts") pod "5c85cf60-3529-4772-a321-73fb08926e3f" (UID: "5c85cf60-3529-4772-a321-73fb08926e3f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.573842 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c85cf60-3529-4772-a321-73fb08926e3f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5c85cf60-3529-4772-a321-73fb08926e3f" (UID: "5c85cf60-3529-4772-a321-73fb08926e3f"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.607493 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5c85cf60-3529-4772-a321-73fb08926e3f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.607653 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5c85cf60-3529-4772-a321-73fb08926e3f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.607747 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5c85cf60-3529-4772-a321-73fb08926e3f-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.607823 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5c85cf60-3529-4772-a321-73fb08926e3f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.607892 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrjzz\" (UniqueName: \"kubernetes.io/projected/5c85cf60-3529-4772-a321-73fb08926e3f-kube-api-access-nrjzz\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:33 crc kubenswrapper[4738]: I0307 07:39:33.607998 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5c85cf60-3529-4772-a321-73fb08926e3f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.040989 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="89b2899eca251f4504a8f2e44ab65c8bc0eb04a6d08fb5609fc135345635eb56" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.041309 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-98jk9" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.396556 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c85cf60-3529-4772-a321-73fb08926e3f" path="/var/lib/kubelet/pods/5c85cf60-3529-4772-a321-73fb08926e3f/volumes" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.540620 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj"] Mar 07 07:39:34 crc kubenswrapper[4738]: E0307 07:39:34.541088 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c85cf60-3529-4772-a321-73fb08926e3f" containerName="swift-ring-rebalance" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.541115 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c85cf60-3529-4772-a321-73fb08926e3f" containerName="swift-ring-rebalance" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.541408 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c85cf60-3529-4772-a321-73fb08926e3f" containerName="swift-ring-rebalance" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.542120 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.547074 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.547431 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.551044 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj"] Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.622742 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78781e05-d35d-426a-b38f-5e3f7cbc716c-swiftconf\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.622796 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78781e05-d35d-426a-b38f-5e3f7cbc716c-ring-data-devices\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.622870 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78781e05-d35d-426a-b38f-5e3f7cbc716c-scripts\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.622922 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78781e05-d35d-426a-b38f-5e3f7cbc716c-etc-swift\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.622944 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78781e05-d35d-426a-b38f-5e3f7cbc716c-dispersionconf\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.622966 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5r59\" (UniqueName: \"kubernetes.io/projected/78781e05-d35d-426a-b38f-5e3f7cbc716c-kube-api-access-p5r59\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.724824 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78781e05-d35d-426a-b38f-5e3f7cbc716c-scripts\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.724905 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78781e05-d35d-426a-b38f-5e3f7cbc716c-dispersionconf\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc 
kubenswrapper[4738]: I0307 07:39:34.724921 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78781e05-d35d-426a-b38f-5e3f7cbc716c-etc-swift\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.724942 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5r59\" (UniqueName: \"kubernetes.io/projected/78781e05-d35d-426a-b38f-5e3f7cbc716c-kube-api-access-p5r59\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.724993 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78781e05-d35d-426a-b38f-5e3f7cbc716c-swiftconf\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.725472 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78781e05-d35d-426a-b38f-5e3f7cbc716c-ring-data-devices\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.725556 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78781e05-d35d-426a-b38f-5e3f7cbc716c-etc-swift\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 
crc kubenswrapper[4738]: I0307 07:39:34.725852 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78781e05-d35d-426a-b38f-5e3f7cbc716c-scripts\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.726030 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78781e05-d35d-426a-b38f-5e3f7cbc716c-ring-data-devices\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.730544 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78781e05-d35d-426a-b38f-5e3f7cbc716c-swiftconf\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.733742 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78781e05-d35d-426a-b38f-5e3f7cbc716c-dispersionconf\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: I0307 07:39:34.746427 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5r59\" (UniqueName: \"kubernetes.io/projected/78781e05-d35d-426a-b38f-5e3f7cbc716c-kube-api-access-p5r59\") pod \"swift-ring-rebalance-debug-8mjdj\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:34 crc kubenswrapper[4738]: 
I0307 07:39:34.875280 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:35 crc kubenswrapper[4738]: I0307 07:39:35.321644 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj"] Mar 07 07:39:36 crc kubenswrapper[4738]: I0307 07:39:36.064859 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" event={"ID":"78781e05-d35d-426a-b38f-5e3f7cbc716c","Type":"ContainerStarted","Data":"0dec7506154bca4e104a462cf78c6947214246c96a1259917c9874822328c63f"} Mar 07 07:39:36 crc kubenswrapper[4738]: I0307 07:39:36.065195 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" event={"ID":"78781e05-d35d-426a-b38f-5e3f7cbc716c","Type":"ContainerStarted","Data":"28cbc17e95278ec41b474ad10d403975bccdf478ad3d775b735df22c8630afc5"} Mar 07 07:39:36 crc kubenswrapper[4738]: I0307 07:39:36.099317 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" podStartSLOduration=2.099297868 podStartE2EDuration="2.099297868s" podCreationTimestamp="2026-03-07 07:39:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:39:36.09565752 +0000 UTC m=+2394.560644841" watchObservedRunningTime="2026-03-07 07:39:36.099297868 +0000 UTC m=+2394.564285189" Mar 07 07:39:37 crc kubenswrapper[4738]: I0307 07:39:37.078270 4738 generic.go:334] "Generic (PLEG): container finished" podID="78781e05-d35d-426a-b38f-5e3f7cbc716c" containerID="0dec7506154bca4e104a462cf78c6947214246c96a1259917c9874822328c63f" exitCode=0 Mar 07 07:39:37 crc kubenswrapper[4738]: I0307 07:39:37.078331 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" 
event={"ID":"78781e05-d35d-426a-b38f-5e3f7cbc716c","Type":"ContainerDied","Data":"0dec7506154bca4e104a462cf78c6947214246c96a1259917c9874822328c63f"} Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.388555 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.431999 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj"] Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.436656 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj"] Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.485317 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78781e05-d35d-426a-b38f-5e3f7cbc716c-ring-data-devices\") pod \"78781e05-d35d-426a-b38f-5e3f7cbc716c\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.485349 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78781e05-d35d-426a-b38f-5e3f7cbc716c-swiftconf\") pod \"78781e05-d35d-426a-b38f-5e3f7cbc716c\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.485406 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78781e05-d35d-426a-b38f-5e3f7cbc716c-scripts\") pod \"78781e05-d35d-426a-b38f-5e3f7cbc716c\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.485433 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5r59\" (UniqueName: 
\"kubernetes.io/projected/78781e05-d35d-426a-b38f-5e3f7cbc716c-kube-api-access-p5r59\") pod \"78781e05-d35d-426a-b38f-5e3f7cbc716c\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.485474 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78781e05-d35d-426a-b38f-5e3f7cbc716c-etc-swift\") pod \"78781e05-d35d-426a-b38f-5e3f7cbc716c\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.485502 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78781e05-d35d-426a-b38f-5e3f7cbc716c-dispersionconf\") pod \"78781e05-d35d-426a-b38f-5e3f7cbc716c\" (UID: \"78781e05-d35d-426a-b38f-5e3f7cbc716c\") " Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.486284 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78781e05-d35d-426a-b38f-5e3f7cbc716c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "78781e05-d35d-426a-b38f-5e3f7cbc716c" (UID: "78781e05-d35d-426a-b38f-5e3f7cbc716c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.486475 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/78781e05-d35d-426a-b38f-5e3f7cbc716c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "78781e05-d35d-426a-b38f-5e3f7cbc716c" (UID: "78781e05-d35d-426a-b38f-5e3f7cbc716c"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.486897 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/78781e05-d35d-426a-b38f-5e3f7cbc716c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.486920 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/78781e05-d35d-426a-b38f-5e3f7cbc716c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.491257 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78781e05-d35d-426a-b38f-5e3f7cbc716c-kube-api-access-p5r59" (OuterVolumeSpecName: "kube-api-access-p5r59") pod "78781e05-d35d-426a-b38f-5e3f7cbc716c" (UID: "78781e05-d35d-426a-b38f-5e3f7cbc716c"). InnerVolumeSpecName "kube-api-access-p5r59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.504676 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78781e05-d35d-426a-b38f-5e3f7cbc716c-scripts" (OuterVolumeSpecName: "scripts") pod "78781e05-d35d-426a-b38f-5e3f7cbc716c" (UID: "78781e05-d35d-426a-b38f-5e3f7cbc716c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.505515 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78781e05-d35d-426a-b38f-5e3f7cbc716c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "78781e05-d35d-426a-b38f-5e3f7cbc716c" (UID: "78781e05-d35d-426a-b38f-5e3f7cbc716c"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.513390 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/78781e05-d35d-426a-b38f-5e3f7cbc716c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "78781e05-d35d-426a-b38f-5e3f7cbc716c" (UID: "78781e05-d35d-426a-b38f-5e3f7cbc716c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.588393 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/78781e05-d35d-426a-b38f-5e3f7cbc716c-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.588427 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5r59\" (UniqueName: \"kubernetes.io/projected/78781e05-d35d-426a-b38f-5e3f7cbc716c-kube-api-access-p5r59\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.588440 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/78781e05-d35d-426a-b38f-5e3f7cbc716c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:38 crc kubenswrapper[4738]: I0307 07:39:38.588450 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/78781e05-d35d-426a-b38f-5e3f7cbc716c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.100085 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28cbc17e95278ec41b474ad10d403975bccdf478ad3d775b735df22c8630afc5" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.100242 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-8mjdj" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.580309 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-r7j29"] Mar 07 07:39:39 crc kubenswrapper[4738]: E0307 07:39:39.582007 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78781e05-d35d-426a-b38f-5e3f7cbc716c" containerName="swift-ring-rebalance" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.582207 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="78781e05-d35d-426a-b38f-5e3f7cbc716c" containerName="swift-ring-rebalance" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.582671 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="78781e05-d35d-426a-b38f-5e3f7cbc716c" containerName="swift-ring-rebalance" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.583552 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.587224 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.587612 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.594898 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-r7j29"] Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.707380 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a153082f-736e-4e42-8a37-fe87a3f95faa-dispersionconf\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.707435 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a153082f-736e-4e42-8a37-fe87a3f95faa-ring-data-devices\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.707476 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6lkd\" (UniqueName: \"kubernetes.io/projected/a153082f-736e-4e42-8a37-fe87a3f95faa-kube-api-access-z6lkd\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.707499 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a153082f-736e-4e42-8a37-fe87a3f95faa-scripts\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.707534 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a153082f-736e-4e42-8a37-fe87a3f95faa-etc-swift\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.707658 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/a153082f-736e-4e42-8a37-fe87a3f95faa-swiftconf\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.809892 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a153082f-736e-4e42-8a37-fe87a3f95faa-dispersionconf\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.809970 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a153082f-736e-4e42-8a37-fe87a3f95faa-ring-data-devices\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.810066 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6lkd\" (UniqueName: \"kubernetes.io/projected/a153082f-736e-4e42-8a37-fe87a3f95faa-kube-api-access-z6lkd\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.810130 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a153082f-736e-4e42-8a37-fe87a3f95faa-scripts\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.810242 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a153082f-736e-4e42-8a37-fe87a3f95faa-etc-swift\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.810795 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a153082f-736e-4e42-8a37-fe87a3f95faa-ring-data-devices\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.810887 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a153082f-736e-4e42-8a37-fe87a3f95faa-etc-swift\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.811064 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a153082f-736e-4e42-8a37-fe87a3f95faa-swiftconf\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.811397 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a153082f-736e-4e42-8a37-fe87a3f95faa-scripts\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.816591 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/a153082f-736e-4e42-8a37-fe87a3f95faa-swiftconf\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.818060 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a153082f-736e-4e42-8a37-fe87a3f95faa-dispersionconf\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.846723 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6lkd\" (UniqueName: \"kubernetes.io/projected/a153082f-736e-4e42-8a37-fe87a3f95faa-kube-api-access-z6lkd\") pod \"swift-ring-rebalance-debug-r7j29\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:39 crc kubenswrapper[4738]: I0307 07:39:39.906749 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:40 crc kubenswrapper[4738]: I0307 07:39:40.381287 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-r7j29"] Mar 07 07:39:40 crc kubenswrapper[4738]: I0307 07:39:40.415506 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78781e05-d35d-426a-b38f-5e3f7cbc716c" path="/var/lib/kubelet/pods/78781e05-d35d-426a-b38f-5e3f7cbc716c/volumes" Mar 07 07:39:41 crc kubenswrapper[4738]: I0307 07:39:41.124524 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" event={"ID":"a153082f-736e-4e42-8a37-fe87a3f95faa","Type":"ContainerStarted","Data":"a273d92a6ebc4a0727ca734e4c4843ac899106309db60192f8077a59a167418c"} Mar 07 07:39:41 crc kubenswrapper[4738]: I0307 07:39:41.124895 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" event={"ID":"a153082f-736e-4e42-8a37-fe87a3f95faa","Type":"ContainerStarted","Data":"9d5d9a7049289ec15734a62ff31bc530361f5b62bc589d1a350937322b6814a1"} Mar 07 07:39:41 crc kubenswrapper[4738]: I0307 07:39:41.148187 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" podStartSLOduration=2.148142455 podStartE2EDuration="2.148142455s" podCreationTimestamp="2026-03-07 07:39:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:39:41.143183122 +0000 UTC m=+2399.608170483" watchObservedRunningTime="2026-03-07 07:39:41.148142455 +0000 UTC m=+2399.613129786" Mar 07 07:39:42 crc kubenswrapper[4738]: I0307 07:39:42.137010 4738 generic.go:334] "Generic (PLEG): container finished" podID="a153082f-736e-4e42-8a37-fe87a3f95faa" containerID="a273d92a6ebc4a0727ca734e4c4843ac899106309db60192f8077a59a167418c" exitCode=0 
Mar 07 07:39:42 crc kubenswrapper[4738]: I0307 07:39:42.137061 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" event={"ID":"a153082f-736e-4e42-8a37-fe87a3f95faa","Type":"ContainerDied","Data":"a273d92a6ebc4a0727ca734e4c4843ac899106309db60192f8077a59a167418c"} Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.491595 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.525306 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-r7j29"] Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.535797 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-r7j29"] Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.578003 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6lkd\" (UniqueName: \"kubernetes.io/projected/a153082f-736e-4e42-8a37-fe87a3f95faa-kube-api-access-z6lkd\") pod \"a153082f-736e-4e42-8a37-fe87a3f95faa\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.578084 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a153082f-736e-4e42-8a37-fe87a3f95faa-swiftconf\") pod \"a153082f-736e-4e42-8a37-fe87a3f95faa\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.578136 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a153082f-736e-4e42-8a37-fe87a3f95faa-scripts\") pod \"a153082f-736e-4e42-8a37-fe87a3f95faa\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.578229 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a153082f-736e-4e42-8a37-fe87a3f95faa-ring-data-devices\") pod \"a153082f-736e-4e42-8a37-fe87a3f95faa\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.578472 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a153082f-736e-4e42-8a37-fe87a3f95faa-etc-swift\") pod \"a153082f-736e-4e42-8a37-fe87a3f95faa\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.578497 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a153082f-736e-4e42-8a37-fe87a3f95faa-dispersionconf\") pod \"a153082f-736e-4e42-8a37-fe87a3f95faa\" (UID: \"a153082f-736e-4e42-8a37-fe87a3f95faa\") " Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.579123 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a153082f-736e-4e42-8a37-fe87a3f95faa-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a153082f-736e-4e42-8a37-fe87a3f95faa" (UID: "a153082f-736e-4e42-8a37-fe87a3f95faa"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.579764 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a153082f-736e-4e42-8a37-fe87a3f95faa-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a153082f-736e-4e42-8a37-fe87a3f95faa" (UID: "a153082f-736e-4e42-8a37-fe87a3f95faa"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.584817 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a153082f-736e-4e42-8a37-fe87a3f95faa-kube-api-access-z6lkd" (OuterVolumeSpecName: "kube-api-access-z6lkd") pod "a153082f-736e-4e42-8a37-fe87a3f95faa" (UID: "a153082f-736e-4e42-8a37-fe87a3f95faa"). InnerVolumeSpecName "kube-api-access-z6lkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.604711 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a153082f-736e-4e42-8a37-fe87a3f95faa-scripts" (OuterVolumeSpecName: "scripts") pod "a153082f-736e-4e42-8a37-fe87a3f95faa" (UID: "a153082f-736e-4e42-8a37-fe87a3f95faa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.610095 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a153082f-736e-4e42-8a37-fe87a3f95faa-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a153082f-736e-4e42-8a37-fe87a3f95faa" (UID: "a153082f-736e-4e42-8a37-fe87a3f95faa"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.610531 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a153082f-736e-4e42-8a37-fe87a3f95faa-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a153082f-736e-4e42-8a37-fe87a3f95faa" (UID: "a153082f-736e-4e42-8a37-fe87a3f95faa"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.680139 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a153082f-736e-4e42-8a37-fe87a3f95faa-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.680204 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a153082f-736e-4e42-8a37-fe87a3f95faa-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.680218 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a153082f-736e-4e42-8a37-fe87a3f95faa-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.680229 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6lkd\" (UniqueName: \"kubernetes.io/projected/a153082f-736e-4e42-8a37-fe87a3f95faa-kube-api-access-z6lkd\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.680243 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a153082f-736e-4e42-8a37-fe87a3f95faa-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:43 crc kubenswrapper[4738]: I0307 07:39:43.680252 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a153082f-736e-4e42-8a37-fe87a3f95faa-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:44 crc kubenswrapper[4738]: I0307 07:39:44.166716 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d5d9a7049289ec15734a62ff31bc530361f5b62bc589d1a350937322b6814a1" Mar 07 07:39:44 crc kubenswrapper[4738]: I0307 07:39:44.166793 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-r7j29" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.579767 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a153082f-736e-4e42-8a37-fe87a3f95faa" path="/var/lib/kubelet/pods/a153082f-736e-4e42-8a37-fe87a3f95faa/volumes" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.580811 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz"] Mar 07 07:39:46 crc kubenswrapper[4738]: E0307 07:39:46.581048 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a153082f-736e-4e42-8a37-fe87a3f95faa" containerName="swift-ring-rebalance" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.581062 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="a153082f-736e-4e42-8a37-fe87a3f95faa" containerName="swift-ring-rebalance" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.581325 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="a153082f-736e-4e42-8a37-fe87a3f95faa" containerName="swift-ring-rebalance" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.581878 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz"] Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.581966 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.586208 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.586672 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.724904 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-swiftconf\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.725100 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m2jhd\" (UniqueName: \"kubernetes.io/projected/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-kube-api-access-m2jhd\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.725143 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-scripts\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.725440 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-dispersionconf\") pod 
\"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.725574 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-ring-data-devices\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.725611 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-etc-swift\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.826967 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-swiftconf\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.827058 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m2jhd\" (UniqueName: \"kubernetes.io/projected/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-kube-api-access-m2jhd\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.827078 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-scripts\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.827120 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-dispersionconf\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.827151 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-ring-data-devices\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.827186 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-etc-swift\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.827638 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-etc-swift\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.828181 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-ring-data-devices\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.828314 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-scripts\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.833654 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-dispersionconf\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.838420 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-swiftconf\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.843179 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m2jhd\" (UniqueName: \"kubernetes.io/projected/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-kube-api-access-m2jhd\") pod \"swift-ring-rebalance-debug-wf4zz\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:46 crc kubenswrapper[4738]: I0307 07:39:46.919042 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:47 crc kubenswrapper[4738]: I0307 07:39:47.111041 4738 scope.go:117] "RemoveContainer" containerID="e2a40bf195c6b9012453b43ad2e4fbe24b063ee0c7b348081e0cc5a07d7e7328" Mar 07 07:39:47 crc kubenswrapper[4738]: I0307 07:39:47.148801 4738 scope.go:117] "RemoveContainer" containerID="f98c9f011e414265be3c18216d73d08ce7cb52ab22eb8d26e6002da46eedb71a" Mar 07 07:39:47 crc kubenswrapper[4738]: I0307 07:39:47.192518 4738 scope.go:117] "RemoveContainer" containerID="be04b1a066ae7dd9303878b6bd39ac5173216d92b1ec98b92dbd4723b96918df" Mar 07 07:39:47 crc kubenswrapper[4738]: I0307 07:39:47.196488 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz"] Mar 07 07:39:47 crc kubenswrapper[4738]: I0307 07:39:47.232484 4738 scope.go:117] "RemoveContainer" containerID="142b3c49962feb4ee438dbc912d98c32f7bb88c4b9c90a420590eaea712f8dd1" Mar 07 07:39:47 crc kubenswrapper[4738]: I0307 07:39:47.272817 4738 scope.go:117] "RemoveContainer" containerID="eb4b7b2b8467332219a46116917aa28deeeb097854357965abee4ac3b59d453b" Mar 07 07:39:47 crc kubenswrapper[4738]: I0307 07:39:47.302139 4738 scope.go:117] "RemoveContainer" containerID="f4020892bbabb1763f0f8b2ef48e9f8e6d2d664aa3f6b1e9363d7d637260bed3" Mar 07 07:39:47 crc kubenswrapper[4738]: I0307 07:39:47.332127 4738 scope.go:117] "RemoveContainer" containerID="c87ece0ce05d82a833aa45c5cd191768c2d0189a01b176dcf0074a82cca3bf50" Mar 07 07:39:47 crc kubenswrapper[4738]: I0307 07:39:47.357346 4738 scope.go:117] "RemoveContainer" containerID="b636b2bad93e52f58cdd67493996fa208c34e09f9a7b44d047365049ef19860b" Mar 07 07:39:47 crc kubenswrapper[4738]: I0307 07:39:47.390024 4738 scope.go:117] "RemoveContainer" containerID="a50bb74c16bc932859d2d2d1802bb162810bdf4adcd21ab01491aac26ca66b9f" Mar 07 07:39:47 crc kubenswrapper[4738]: I0307 07:39:47.602147 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" event={"ID":"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5","Type":"ContainerStarted","Data":"9f80f2af8c4e7c35651be8becd660a0956c226418eb9ea9fad6d192a44467c1d"} Mar 07 07:39:47 crc kubenswrapper[4738]: I0307 07:39:47.602222 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" event={"ID":"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5","Type":"ContainerStarted","Data":"c28da09d267937f7c14db53f8c50c7c250246fd7801802808ad77a13fd27479c"} Mar 07 07:39:47 crc kubenswrapper[4738]: I0307 07:39:47.627107 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" podStartSLOduration=1.627089964 podStartE2EDuration="1.627089964s" podCreationTimestamp="2026-03-07 07:39:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:39:47.616617425 +0000 UTC m=+2406.081604756" watchObservedRunningTime="2026-03-07 07:39:47.627089964 +0000 UTC m=+2406.092077285" Mar 07 07:39:49 crc kubenswrapper[4738]: I0307 07:39:49.632003 4738 generic.go:334] "Generic (PLEG): container finished" podID="ace0dc47-4d9c-4fb8-8d39-70c161fafbd5" containerID="9f80f2af8c4e7c35651be8becd660a0956c226418eb9ea9fad6d192a44467c1d" exitCode=0 Mar 07 07:39:49 crc kubenswrapper[4738]: I0307 07:39:49.632136 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" event={"ID":"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5","Type":"ContainerDied","Data":"9f80f2af8c4e7c35651be8becd660a0956c226418eb9ea9fad6d192a44467c1d"} Mar 07 07:39:50 crc kubenswrapper[4738]: I0307 07:39:50.952768 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:50 crc kubenswrapper[4738]: I0307 07:39:50.989044 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz"] Mar 07 07:39:50 crc kubenswrapper[4738]: I0307 07:39:50.996224 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz"] Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.106098 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-ring-data-devices\") pod \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.106186 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m2jhd\" (UniqueName: \"kubernetes.io/projected/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-kube-api-access-m2jhd\") pod \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.106249 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-dispersionconf\") pod \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.106374 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-swiftconf\") pod \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.106669 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-etc-swift\") pod \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.106706 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-scripts\") pod \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\" (UID: \"ace0dc47-4d9c-4fb8-8d39-70c161fafbd5\") " Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.106865 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ace0dc47-4d9c-4fb8-8d39-70c161fafbd5" (UID: "ace0dc47-4d9c-4fb8-8d39-70c161fafbd5"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.107454 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.107906 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ace0dc47-4d9c-4fb8-8d39-70c161fafbd5" (UID: "ace0dc47-4d9c-4fb8-8d39-70c161fafbd5"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.112575 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-kube-api-access-m2jhd" (OuterVolumeSpecName: "kube-api-access-m2jhd") pod "ace0dc47-4d9c-4fb8-8d39-70c161fafbd5" (UID: "ace0dc47-4d9c-4fb8-8d39-70c161fafbd5"). InnerVolumeSpecName "kube-api-access-m2jhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.136772 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ace0dc47-4d9c-4fb8-8d39-70c161fafbd5" (UID: "ace0dc47-4d9c-4fb8-8d39-70c161fafbd5"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.144764 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ace0dc47-4d9c-4fb8-8d39-70c161fafbd5" (UID: "ace0dc47-4d9c-4fb8-8d39-70c161fafbd5"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.146448 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-scripts" (OuterVolumeSpecName: "scripts") pod "ace0dc47-4d9c-4fb8-8d39-70c161fafbd5" (UID: "ace0dc47-4d9c-4fb8-8d39-70c161fafbd5"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.210338 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m2jhd\" (UniqueName: \"kubernetes.io/projected/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-kube-api-access-m2jhd\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.210424 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.210493 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.210513 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.210531 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.657048 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c28da09d267937f7c14db53f8c50c7c250246fd7801802808ad77a13fd27479c" Mar 07 07:39:51 crc kubenswrapper[4738]: I0307 07:39:51.657143 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wf4zz" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.168747 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pdl68"] Mar 07 07:39:52 crc kubenswrapper[4738]: E0307 07:39:52.169032 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ace0dc47-4d9c-4fb8-8d39-70c161fafbd5" containerName="swift-ring-rebalance" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.169044 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="ace0dc47-4d9c-4fb8-8d39-70c161fafbd5" containerName="swift-ring-rebalance" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.169257 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="ace0dc47-4d9c-4fb8-8d39-70c161fafbd5" containerName="swift-ring-rebalance" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.170644 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.173523 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.176809 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.185740 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pdl68"] Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.271247 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-scripts\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.272008 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-dispersionconf\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.272233 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4z8rr\" (UniqueName: \"kubernetes.io/projected/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-kube-api-access-4z8rr\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.272392 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-etc-swift\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.272530 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-swiftconf\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.272636 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-ring-data-devices\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.379473 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-swiftconf\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.379535 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-ring-data-devices\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.379584 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-scripts\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.379651 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-dispersionconf\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.379707 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4z8rr\" 
(UniqueName: \"kubernetes.io/projected/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-kube-api-access-4z8rr\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.379735 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-etc-swift\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.380249 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-etc-swift\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.380624 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-ring-data-devices\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.380624 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-scripts\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.387863 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-dispersionconf\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.389615 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-swiftconf\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.395028 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ace0dc47-4d9c-4fb8-8d39-70c161fafbd5" path="/var/lib/kubelet/pods/ace0dc47-4d9c-4fb8-8d39-70c161fafbd5/volumes" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.400773 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4z8rr\" (UniqueName: \"kubernetes.io/projected/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-kube-api-access-4z8rr\") pod \"swift-ring-rebalance-debug-pdl68\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.489431 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:52 crc kubenswrapper[4738]: I0307 07:39:52.961456 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pdl68"] Mar 07 07:39:53 crc kubenswrapper[4738]: I0307 07:39:53.674556 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" event={"ID":"ff0efbfc-2678-4dce-8278-f1dd35d50b6d","Type":"ContainerStarted","Data":"6710fee685b33eaf446d48b237c991f4ec5ce8018a71b5b2344a1f0b85cf8a97"} Mar 07 07:39:53 crc kubenswrapper[4738]: I0307 07:39:53.674893 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" event={"ID":"ff0efbfc-2678-4dce-8278-f1dd35d50b6d","Type":"ContainerStarted","Data":"db527220b3489cfe5980bcb90446302453dc938ad9e85b8e4edaaeec4bbd35b5"} Mar 07 07:39:53 crc kubenswrapper[4738]: I0307 07:39:53.695140 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" podStartSLOduration=1.6951123479999999 podStartE2EDuration="1.695112348s" podCreationTimestamp="2026-03-07 07:39:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:39:53.69147761 +0000 UTC m=+2412.156464931" watchObservedRunningTime="2026-03-07 07:39:53.695112348 +0000 UTC m=+2412.160099669" Mar 07 07:39:54 crc kubenswrapper[4738]: I0307 07:39:54.685815 4738 generic.go:334] "Generic (PLEG): container finished" podID="ff0efbfc-2678-4dce-8278-f1dd35d50b6d" containerID="6710fee685b33eaf446d48b237c991f4ec5ce8018a71b5b2344a1f0b85cf8a97" exitCode=0 Mar 07 07:39:54 crc kubenswrapper[4738]: I0307 07:39:54.685881 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" 
event={"ID":"ff0efbfc-2678-4dce-8278-f1dd35d50b6d","Type":"ContainerDied","Data":"6710fee685b33eaf446d48b237c991f4ec5ce8018a71b5b2344a1f0b85cf8a97"} Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.026110 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.042000 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-dispersionconf\") pod \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.042173 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-etc-swift\") pod \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.042232 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-scripts\") pod \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.042273 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-ring-data-devices\") pod \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.042320 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4z8rr\" (UniqueName: 
\"kubernetes.io/projected/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-kube-api-access-4z8rr\") pod \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.042385 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-swiftconf\") pod \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\" (UID: \"ff0efbfc-2678-4dce-8278-f1dd35d50b6d\") " Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.042859 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "ff0efbfc-2678-4dce-8278-f1dd35d50b6d" (UID: "ff0efbfc-2678-4dce-8278-f1dd35d50b6d"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.043612 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "ff0efbfc-2678-4dce-8278-f1dd35d50b6d" (UID: "ff0efbfc-2678-4dce-8278-f1dd35d50b6d"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.047824 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-kube-api-access-4z8rr" (OuterVolumeSpecName: "kube-api-access-4z8rr") pod "ff0efbfc-2678-4dce-8278-f1dd35d50b6d" (UID: "ff0efbfc-2678-4dce-8278-f1dd35d50b6d"). InnerVolumeSpecName "kube-api-access-4z8rr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.064212 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pdl68"] Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.075625 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-scripts" (OuterVolumeSpecName: "scripts") pod "ff0efbfc-2678-4dce-8278-f1dd35d50b6d" (UID: "ff0efbfc-2678-4dce-8278-f1dd35d50b6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.077291 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "ff0efbfc-2678-4dce-8278-f1dd35d50b6d" (UID: "ff0efbfc-2678-4dce-8278-f1dd35d50b6d"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.078652 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "ff0efbfc-2678-4dce-8278-f1dd35d50b6d" (UID: "ff0efbfc-2678-4dce-8278-f1dd35d50b6d"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.080453 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-pdl68"] Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.144077 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.144465 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.144481 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4z8rr\" (UniqueName: \"kubernetes.io/projected/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-kube-api-access-4z8rr\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.144492 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.144508 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.144522 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/ff0efbfc-2678-4dce-8278-f1dd35d50b6d-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.402123 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff0efbfc-2678-4dce-8278-f1dd35d50b6d" 
path="/var/lib/kubelet/pods/ff0efbfc-2678-4dce-8278-f1dd35d50b6d/volumes" Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.703938 4738 scope.go:117] "RemoveContainer" containerID="6710fee685b33eaf446d48b237c991f4ec5ce8018a71b5b2344a1f0b85cf8a97" Mar 07 07:39:56 crc kubenswrapper[4738]: I0307 07:39:56.703963 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-pdl68" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.260834 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl"] Mar 07 07:39:57 crc kubenswrapper[4738]: E0307 07:39:57.261863 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff0efbfc-2678-4dce-8278-f1dd35d50b6d" containerName="swift-ring-rebalance" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.261902 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff0efbfc-2678-4dce-8278-f1dd35d50b6d" containerName="swift-ring-rebalance" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.262303 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff0efbfc-2678-4dce-8278-f1dd35d50b6d" containerName="swift-ring-rebalance" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.263151 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.266213 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.267277 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.272604 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl"] Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.361636 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fwdj\" (UniqueName: \"kubernetes.io/projected/1552df84-b0f3-4af6-b278-429267ed8d59-kube-api-access-8fwdj\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.361699 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1552df84-b0f3-4af6-b278-429267ed8d59-swiftconf\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.361753 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1552df84-b0f3-4af6-b278-429267ed8d59-scripts\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.361829 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1552df84-b0f3-4af6-b278-429267ed8d59-ring-data-devices\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.361851 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1552df84-b0f3-4af6-b278-429267ed8d59-etc-swift\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.362052 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1552df84-b0f3-4af6-b278-429267ed8d59-dispersionconf\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.464275 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1552df84-b0f3-4af6-b278-429267ed8d59-scripts\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.464677 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1552df84-b0f3-4af6-b278-429267ed8d59-ring-data-devices\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc 
kubenswrapper[4738]: I0307 07:39:57.464731 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1552df84-b0f3-4af6-b278-429267ed8d59-etc-swift\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.464785 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1552df84-b0f3-4af6-b278-429267ed8d59-dispersionconf\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.464883 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fwdj\" (UniqueName: \"kubernetes.io/projected/1552df84-b0f3-4af6-b278-429267ed8d59-kube-api-access-8fwdj\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.464946 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1552df84-b0f3-4af6-b278-429267ed8d59-swiftconf\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.465984 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1552df84-b0f3-4af6-b278-429267ed8d59-etc-swift\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc 
kubenswrapper[4738]: I0307 07:39:57.466330 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1552df84-b0f3-4af6-b278-429267ed8d59-ring-data-devices\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.466628 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1552df84-b0f3-4af6-b278-429267ed8d59-scripts\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.478341 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1552df84-b0f3-4af6-b278-429267ed8d59-dispersionconf\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.479085 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1552df84-b0f3-4af6-b278-429267ed8d59-swiftconf\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.492651 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fwdj\" (UniqueName: \"kubernetes.io/projected/1552df84-b0f3-4af6-b278-429267ed8d59-kube-api-access-8fwdj\") pod \"swift-ring-rebalance-debug-mvhpl\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: 
I0307 07:39:57.587314 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:39:57 crc kubenswrapper[4738]: I0307 07:39:57.820607 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl"] Mar 07 07:39:58 crc kubenswrapper[4738]: I0307 07:39:58.732553 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" event={"ID":"1552df84-b0f3-4af6-b278-429267ed8d59","Type":"ContainerStarted","Data":"a8ded881e7942ac605e669272761fb6817e39ffc813409b452b44646ecff79b1"} Mar 07 07:39:58 crc kubenswrapper[4738]: I0307 07:39:58.732608 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" event={"ID":"1552df84-b0f3-4af6-b278-429267ed8d59","Type":"ContainerStarted","Data":"a8ab2b2519b39fc3eb5d0396856e87bb8331eea555e5dc440326b663e78e513e"} Mar 07 07:39:58 crc kubenswrapper[4738]: I0307 07:39:58.753400 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" podStartSLOduration=1.753378527 podStartE2EDuration="1.753378527s" podCreationTimestamp="2026-03-07 07:39:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:39:58.749245868 +0000 UTC m=+2417.214233219" watchObservedRunningTime="2026-03-07 07:39:58.753378527 +0000 UTC m=+2417.218365848" Mar 07 07:39:59 crc kubenswrapper[4738]: I0307 07:39:59.742718 4738 generic.go:334] "Generic (PLEG): container finished" podID="1552df84-b0f3-4af6-b278-429267ed8d59" containerID="a8ded881e7942ac605e669272761fb6817e39ffc813409b452b44646ecff79b1" exitCode=0 Mar 07 07:39:59 crc kubenswrapper[4738]: I0307 07:39:59.742827 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" 
event={"ID":"1552df84-b0f3-4af6-b278-429267ed8d59","Type":"ContainerDied","Data":"a8ded881e7942ac605e669272761fb6817e39ffc813409b452b44646ecff79b1"} Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.153517 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547820-788v7"] Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.154832 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547820-788v7" Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.157860 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.158015 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.163898 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.172094 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547820-788v7"] Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.254760 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgpsr\" (UniqueName: \"kubernetes.io/projected/4f325caa-c71f-44ae-8312-2241b9b8bc67-kube-api-access-vgpsr\") pod \"auto-csr-approver-29547820-788v7\" (UID: \"4f325caa-c71f-44ae-8312-2241b9b8bc67\") " pod="openshift-infra/auto-csr-approver-29547820-788v7" Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.356273 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgpsr\" (UniqueName: \"kubernetes.io/projected/4f325caa-c71f-44ae-8312-2241b9b8bc67-kube-api-access-vgpsr\") pod \"auto-csr-approver-29547820-788v7\" (UID: 
\"4f325caa-c71f-44ae-8312-2241b9b8bc67\") " pod="openshift-infra/auto-csr-approver-29547820-788v7" Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.377999 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgpsr\" (UniqueName: \"kubernetes.io/projected/4f325caa-c71f-44ae-8312-2241b9b8bc67-kube-api-access-vgpsr\") pod \"auto-csr-approver-29547820-788v7\" (UID: \"4f325caa-c71f-44ae-8312-2241b9b8bc67\") " pod="openshift-infra/auto-csr-approver-29547820-788v7" Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.475438 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547820-788v7" Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.880558 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547820-788v7"] Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.894428 4738 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.961286 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.971317 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1552df84-b0f3-4af6-b278-429267ed8d59-etc-swift\") pod \"1552df84-b0f3-4af6-b278-429267ed8d59\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.971394 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1552df84-b0f3-4af6-b278-429267ed8d59-dispersionconf\") pod \"1552df84-b0f3-4af6-b278-429267ed8d59\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.971446 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1552df84-b0f3-4af6-b278-429267ed8d59-ring-data-devices\") pod \"1552df84-b0f3-4af6-b278-429267ed8d59\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.971477 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1552df84-b0f3-4af6-b278-429267ed8d59-scripts\") pod \"1552df84-b0f3-4af6-b278-429267ed8d59\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.971554 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1552df84-b0f3-4af6-b278-429267ed8d59-swiftconf\") pod \"1552df84-b0f3-4af6-b278-429267ed8d59\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.971633 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fwdj\" 
(UniqueName: \"kubernetes.io/projected/1552df84-b0f3-4af6-b278-429267ed8d59-kube-api-access-8fwdj\") pod \"1552df84-b0f3-4af6-b278-429267ed8d59\" (UID: \"1552df84-b0f3-4af6-b278-429267ed8d59\") " Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.972403 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1552df84-b0f3-4af6-b278-429267ed8d59-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1552df84-b0f3-4af6-b278-429267ed8d59" (UID: "1552df84-b0f3-4af6-b278-429267ed8d59"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.972588 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1552df84-b0f3-4af6-b278-429267ed8d59-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1552df84-b0f3-4af6-b278-429267ed8d59" (UID: "1552df84-b0f3-4af6-b278-429267ed8d59"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.976692 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1552df84-b0f3-4af6-b278-429267ed8d59-kube-api-access-8fwdj" (OuterVolumeSpecName: "kube-api-access-8fwdj") pod "1552df84-b0f3-4af6-b278-429267ed8d59" (UID: "1552df84-b0f3-4af6-b278-429267ed8d59"). InnerVolumeSpecName "kube-api-access-8fwdj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.991973 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl"] Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.998547 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1552df84-b0f3-4af6-b278-429267ed8d59-scripts" (OuterVolumeSpecName: "scripts") pod "1552df84-b0f3-4af6-b278-429267ed8d59" (UID: "1552df84-b0f3-4af6-b278-429267ed8d59"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.999218 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1552df84-b0f3-4af6-b278-429267ed8d59-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1552df84-b0f3-4af6-b278-429267ed8d59" (UID: "1552df84-b0f3-4af6-b278-429267ed8d59"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.999315 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl"] Mar 07 07:40:00 crc kubenswrapper[4738]: I0307 07:40:00.999600 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1552df84-b0f3-4af6-b278-429267ed8d59-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1552df84-b0f3-4af6-b278-429267ed8d59" (UID: "1552df84-b0f3-4af6-b278-429267ed8d59"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:01 crc kubenswrapper[4738]: I0307 07:40:01.074258 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fwdj\" (UniqueName: \"kubernetes.io/projected/1552df84-b0f3-4af6-b278-429267ed8d59-kube-api-access-8fwdj\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:01 crc kubenswrapper[4738]: I0307 07:40:01.074296 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1552df84-b0f3-4af6-b278-429267ed8d59-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:01 crc kubenswrapper[4738]: I0307 07:40:01.074307 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1552df84-b0f3-4af6-b278-429267ed8d59-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:01 crc kubenswrapper[4738]: I0307 07:40:01.074315 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1552df84-b0f3-4af6-b278-429267ed8d59-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:01 crc kubenswrapper[4738]: I0307 07:40:01.074325 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1552df84-b0f3-4af6-b278-429267ed8d59-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:01 crc kubenswrapper[4738]: I0307 07:40:01.074334 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1552df84-b0f3-4af6-b278-429267ed8d59-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:01 crc kubenswrapper[4738]: I0307 07:40:01.772063 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547820-788v7" event={"ID":"4f325caa-c71f-44ae-8312-2241b9b8bc67","Type":"ContainerStarted","Data":"aa6003fce7f5fd21df0f0f55878548b3f165ca5782a22861e6cc7258a7bad735"} Mar 07 07:40:01 crc 
kubenswrapper[4738]: I0307 07:40:01.776260 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8ab2b2519b39fc3eb5d0396856e87bb8331eea555e5dc440326b663e78e513e" Mar 07 07:40:01 crc kubenswrapper[4738]: I0307 07:40:01.776374 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mvhpl" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.144224 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt"] Mar 07 07:40:02 crc kubenswrapper[4738]: E0307 07:40:02.145263 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1552df84-b0f3-4af6-b278-429267ed8d59" containerName="swift-ring-rebalance" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.145281 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="1552df84-b0f3-4af6-b278-429267ed8d59" containerName="swift-ring-rebalance" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.145479 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="1552df84-b0f3-4af6-b278-429267ed8d59" containerName="swift-ring-rebalance" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.146278 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.148358 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.150351 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.158577 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt"] Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.188942 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mndgp\" (UniqueName: \"kubernetes.io/projected/a5fc159b-5f32-4b3b-945c-22805e0b52f9-kube-api-access-mndgp\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.189003 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5fc159b-5f32-4b3b-945c-22805e0b52f9-scripts\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.189056 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a5fc159b-5f32-4b3b-945c-22805e0b52f9-swiftconf\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.189182 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a5fc159b-5f32-4b3b-945c-22805e0b52f9-dispersionconf\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.189229 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a5fc159b-5f32-4b3b-945c-22805e0b52f9-etc-swift\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.189414 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a5fc159b-5f32-4b3b-945c-22805e0b52f9-ring-data-devices\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.290423 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a5fc159b-5f32-4b3b-945c-22805e0b52f9-ring-data-devices\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.290489 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mndgp\" (UniqueName: \"kubernetes.io/projected/a5fc159b-5f32-4b3b-945c-22805e0b52f9-kube-api-access-mndgp\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.290525 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5fc159b-5f32-4b3b-945c-22805e0b52f9-scripts\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.290562 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a5fc159b-5f32-4b3b-945c-22805e0b52f9-swiftconf\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.290607 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a5fc159b-5f32-4b3b-945c-22805e0b52f9-dispersionconf\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.290628 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a5fc159b-5f32-4b3b-945c-22805e0b52f9-etc-swift\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.291255 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a5fc159b-5f32-4b3b-945c-22805e0b52f9-etc-swift\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.291584 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5fc159b-5f32-4b3b-945c-22805e0b52f9-scripts\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.291712 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a5fc159b-5f32-4b3b-945c-22805e0b52f9-ring-data-devices\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.299837 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a5fc159b-5f32-4b3b-945c-22805e0b52f9-dispersionconf\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.301529 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a5fc159b-5f32-4b3b-945c-22805e0b52f9-swiftconf\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.313131 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mndgp\" (UniqueName: \"kubernetes.io/projected/a5fc159b-5f32-4b3b-945c-22805e0b52f9-kube-api-access-mndgp\") pod \"swift-ring-rebalance-debug-ljmqt\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.397119 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1552df84-b0f3-4af6-b278-429267ed8d59" path="/var/lib/kubelet/pods/1552df84-b0f3-4af6-b278-429267ed8d59/volumes" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.470589 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.752234 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt"] Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.786456 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" event={"ID":"a5fc159b-5f32-4b3b-945c-22805e0b52f9","Type":"ContainerStarted","Data":"c288e5c89bab6ff8b45637657d48d7c40935608462a8f73f4f7e3a1f30eea427"} Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.788399 4738 generic.go:334] "Generic (PLEG): container finished" podID="4f325caa-c71f-44ae-8312-2241b9b8bc67" containerID="9aa776cfcdbab9636869a5ea15ac1b0a266f0a350ded10223bbf16ea8f246c1c" exitCode=0 Mar 07 07:40:02 crc kubenswrapper[4738]: I0307 07:40:02.788437 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547820-788v7" event={"ID":"4f325caa-c71f-44ae-8312-2241b9b8bc67","Type":"ContainerDied","Data":"9aa776cfcdbab9636869a5ea15ac1b0a266f0a350ded10223bbf16ea8f246c1c"} Mar 07 07:40:03 crc kubenswrapper[4738]: I0307 07:40:03.801770 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" event={"ID":"a5fc159b-5f32-4b3b-945c-22805e0b52f9","Type":"ContainerStarted","Data":"eedfed731ce66721fbab7ceed07eb14d895a455b05cff80c505f80b989828c27"} Mar 07 07:40:04 crc kubenswrapper[4738]: I0307 07:40:04.096111 4738 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547820-788v7" Mar 07 07:40:04 crc kubenswrapper[4738]: I0307 07:40:04.112095 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" podStartSLOduration=2.112076909 podStartE2EDuration="2.112076909s" podCreationTimestamp="2026-03-07 07:40:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:40:03.825505368 +0000 UTC m=+2422.290492689" watchObservedRunningTime="2026-03-07 07:40:04.112076909 +0000 UTC m=+2422.577064230" Mar 07 07:40:04 crc kubenswrapper[4738]: I0307 07:40:04.123705 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgpsr\" (UniqueName: \"kubernetes.io/projected/4f325caa-c71f-44ae-8312-2241b9b8bc67-kube-api-access-vgpsr\") pod \"4f325caa-c71f-44ae-8312-2241b9b8bc67\" (UID: \"4f325caa-c71f-44ae-8312-2241b9b8bc67\") " Mar 07 07:40:04 crc kubenswrapper[4738]: I0307 07:40:04.132147 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f325caa-c71f-44ae-8312-2241b9b8bc67-kube-api-access-vgpsr" (OuterVolumeSpecName: "kube-api-access-vgpsr") pod "4f325caa-c71f-44ae-8312-2241b9b8bc67" (UID: "4f325caa-c71f-44ae-8312-2241b9b8bc67"). InnerVolumeSpecName "kube-api-access-vgpsr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:40:04 crc kubenswrapper[4738]: I0307 07:40:04.225383 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgpsr\" (UniqueName: \"kubernetes.io/projected/4f325caa-c71f-44ae-8312-2241b9b8bc67-kube-api-access-vgpsr\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:04 crc kubenswrapper[4738]: I0307 07:40:04.821686 4738 generic.go:334] "Generic (PLEG): container finished" podID="a5fc159b-5f32-4b3b-945c-22805e0b52f9" containerID="eedfed731ce66721fbab7ceed07eb14d895a455b05cff80c505f80b989828c27" exitCode=0 Mar 07 07:40:04 crc kubenswrapper[4738]: I0307 07:40:04.821765 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" event={"ID":"a5fc159b-5f32-4b3b-945c-22805e0b52f9","Type":"ContainerDied","Data":"eedfed731ce66721fbab7ceed07eb14d895a455b05cff80c505f80b989828c27"} Mar 07 07:40:04 crc kubenswrapper[4738]: I0307 07:40:04.824019 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547820-788v7" event={"ID":"4f325caa-c71f-44ae-8312-2241b9b8bc67","Type":"ContainerDied","Data":"aa6003fce7f5fd21df0f0f55878548b3f165ca5782a22861e6cc7258a7bad735"} Mar 07 07:40:04 crc kubenswrapper[4738]: I0307 07:40:04.824049 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa6003fce7f5fd21df0f0f55878548b3f165ca5782a22861e6cc7258a7bad735" Mar 07 07:40:04 crc kubenswrapper[4738]: I0307 07:40:04.824071 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547820-788v7" Mar 07 07:40:05 crc kubenswrapper[4738]: I0307 07:40:05.171600 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547814-wjw5r"] Mar 07 07:40:05 crc kubenswrapper[4738]: I0307 07:40:05.178639 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547814-wjw5r"] Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.188193 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.221940 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt"] Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.226504 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt"] Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.265703 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a5fc159b-5f32-4b3b-945c-22805e0b52f9-ring-data-devices\") pod \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.265770 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5fc159b-5f32-4b3b-945c-22805e0b52f9-scripts\") pod \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.265829 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mndgp\" (UniqueName: \"kubernetes.io/projected/a5fc159b-5f32-4b3b-945c-22805e0b52f9-kube-api-access-mndgp\") pod \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\" 
(UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.265853 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a5fc159b-5f32-4b3b-945c-22805e0b52f9-etc-swift\") pod \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.265887 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a5fc159b-5f32-4b3b-945c-22805e0b52f9-swiftconf\") pod \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.265936 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a5fc159b-5f32-4b3b-945c-22805e0b52f9-dispersionconf\") pod \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\" (UID: \"a5fc159b-5f32-4b3b-945c-22805e0b52f9\") " Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.266500 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5fc159b-5f32-4b3b-945c-22805e0b52f9-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a5fc159b-5f32-4b3b-945c-22805e0b52f9" (UID: "a5fc159b-5f32-4b3b-945c-22805e0b52f9"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.267532 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5fc159b-5f32-4b3b-945c-22805e0b52f9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a5fc159b-5f32-4b3b-945c-22805e0b52f9" (UID: "a5fc159b-5f32-4b3b-945c-22805e0b52f9"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.271223 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5fc159b-5f32-4b3b-945c-22805e0b52f9-kube-api-access-mndgp" (OuterVolumeSpecName: "kube-api-access-mndgp") pod "a5fc159b-5f32-4b3b-945c-22805e0b52f9" (UID: "a5fc159b-5f32-4b3b-945c-22805e0b52f9"). InnerVolumeSpecName "kube-api-access-mndgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.285748 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a5fc159b-5f32-4b3b-945c-22805e0b52f9-scripts" (OuterVolumeSpecName: "scripts") pod "a5fc159b-5f32-4b3b-945c-22805e0b52f9" (UID: "a5fc159b-5f32-4b3b-945c-22805e0b52f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.290725 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fc159b-5f32-4b3b-945c-22805e0b52f9-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a5fc159b-5f32-4b3b-945c-22805e0b52f9" (UID: "a5fc159b-5f32-4b3b-945c-22805e0b52f9"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.293419 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5fc159b-5f32-4b3b-945c-22805e0b52f9-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a5fc159b-5f32-4b3b-945c-22805e0b52f9" (UID: "a5fc159b-5f32-4b3b-945c-22805e0b52f9"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.367375 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a5fc159b-5f32-4b3b-945c-22805e0b52f9-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.367411 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a5fc159b-5f32-4b3b-945c-22805e0b52f9-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.367424 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mndgp\" (UniqueName: \"kubernetes.io/projected/a5fc159b-5f32-4b3b-945c-22805e0b52f9-kube-api-access-mndgp\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.367439 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a5fc159b-5f32-4b3b-945c-22805e0b52f9-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.367451 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a5fc159b-5f32-4b3b-945c-22805e0b52f9-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.367462 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a5fc159b-5f32-4b3b-945c-22805e0b52f9-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.398979 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5fc159b-5f32-4b3b-945c-22805e0b52f9" path="/var/lib/kubelet/pods/a5fc159b-5f32-4b3b-945c-22805e0b52f9/volumes" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.399754 4738 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="a6d4ad66-c41b-42f2-9b0d-3e54aa59443e" path="/var/lib/kubelet/pods/a6d4ad66-c41b-42f2-9b0d-3e54aa59443e/volumes" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.844963 4738 scope.go:117] "RemoveContainer" containerID="eedfed731ce66721fbab7ceed07eb14d895a455b05cff80c505f80b989828c27" Mar 07 07:40:06 crc kubenswrapper[4738]: I0307 07:40:06.845024 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-ljmqt" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.414230 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w"] Mar 07 07:40:07 crc kubenswrapper[4738]: E0307 07:40:07.415915 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f325caa-c71f-44ae-8312-2241b9b8bc67" containerName="oc" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.415952 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f325caa-c71f-44ae-8312-2241b9b8bc67" containerName="oc" Mar 07 07:40:07 crc kubenswrapper[4738]: E0307 07:40:07.415983 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5fc159b-5f32-4b3b-945c-22805e0b52f9" containerName="swift-ring-rebalance" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.415999 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5fc159b-5f32-4b3b-945c-22805e0b52f9" containerName="swift-ring-rebalance" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.416347 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5fc159b-5f32-4b3b-945c-22805e0b52f9" containerName="swift-ring-rebalance" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.416380 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f325caa-c71f-44ae-8312-2241b9b8bc67" containerName="oc" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.417329 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.420124 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.422267 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.430050 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w"] Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.589794 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9767681b-294e-4f00-a8b1-aa1a44a8c82c-ring-data-devices\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.589888 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9767681b-294e-4f00-a8b1-aa1a44a8c82c-etc-swift\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.590074 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9767681b-294e-4f00-a8b1-aa1a44a8c82c-swiftconf\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.590285 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9767681b-294e-4f00-a8b1-aa1a44a8c82c-scripts\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.590386 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9767681b-294e-4f00-a8b1-aa1a44a8c82c-dispersionconf\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.590518 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2csv\" (UniqueName: \"kubernetes.io/projected/9767681b-294e-4f00-a8b1-aa1a44a8c82c-kube-api-access-f2csv\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.691567 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2csv\" (UniqueName: \"kubernetes.io/projected/9767681b-294e-4f00-a8b1-aa1a44a8c82c-kube-api-access-f2csv\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.691659 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9767681b-294e-4f00-a8b1-aa1a44a8c82c-ring-data-devices\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.691702 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9767681b-294e-4f00-a8b1-aa1a44a8c82c-etc-swift\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.691727 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9767681b-294e-4f00-a8b1-aa1a44a8c82c-swiftconf\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.691750 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9767681b-294e-4f00-a8b1-aa1a44a8c82c-scripts\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.691793 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9767681b-294e-4f00-a8b1-aa1a44a8c82c-dispersionconf\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.692433 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9767681b-294e-4f00-a8b1-aa1a44a8c82c-etc-swift\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.692750 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9767681b-294e-4f00-a8b1-aa1a44a8c82c-scripts\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.693490 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9767681b-294e-4f00-a8b1-aa1a44a8c82c-ring-data-devices\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.698054 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9767681b-294e-4f00-a8b1-aa1a44a8c82c-dispersionconf\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.698217 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9767681b-294e-4f00-a8b1-aa1a44a8c82c-swiftconf\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.720858 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2csv\" (UniqueName: \"kubernetes.io/projected/9767681b-294e-4f00-a8b1-aa1a44a8c82c-kube-api-access-f2csv\") pod \"swift-ring-rebalance-debug-v9d4w\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:07 crc kubenswrapper[4738]: I0307 07:40:07.755059 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:08 crc kubenswrapper[4738]: I0307 07:40:08.064925 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w"] Mar 07 07:40:08 crc kubenswrapper[4738]: I0307 07:40:08.869447 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" event={"ID":"9767681b-294e-4f00-a8b1-aa1a44a8c82c","Type":"ContainerStarted","Data":"4e01c39fab11c7f0a3ec8732745f40155ef723f801743e383953a57cad658908"} Mar 07 07:40:08 crc kubenswrapper[4738]: I0307 07:40:08.869764 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" event={"ID":"9767681b-294e-4f00-a8b1-aa1a44a8c82c","Type":"ContainerStarted","Data":"1efc5f75fc3e6d3913dbf7b4b6f2ae321c8ff6140884c8e588de864edf5d7d76"} Mar 07 07:40:08 crc kubenswrapper[4738]: I0307 07:40:08.887968 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" podStartSLOduration=1.8879416199999999 podStartE2EDuration="1.88794162s" podCreationTimestamp="2026-03-07 07:40:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:40:08.886333856 +0000 UTC m=+2427.351321217" watchObservedRunningTime="2026-03-07 07:40:08.88794162 +0000 UTC m=+2427.352928961" Mar 07 07:40:09 crc kubenswrapper[4738]: I0307 07:40:09.883184 4738 generic.go:334] "Generic (PLEG): container finished" podID="9767681b-294e-4f00-a8b1-aa1a44a8c82c" containerID="4e01c39fab11c7f0a3ec8732745f40155ef723f801743e383953a57cad658908" exitCode=0 Mar 07 07:40:09 crc kubenswrapper[4738]: I0307 07:40:09.883231 4738 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" event={"ID":"9767681b-294e-4f00-a8b1-aa1a44a8c82c","Type":"ContainerDied","Data":"4e01c39fab11c7f0a3ec8732745f40155ef723f801743e383953a57cad658908"} Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.246484 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.302303 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w"] Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.315838 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w"] Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.352584 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2csv\" (UniqueName: \"kubernetes.io/projected/9767681b-294e-4f00-a8b1-aa1a44a8c82c-kube-api-access-f2csv\") pod \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.352741 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9767681b-294e-4f00-a8b1-aa1a44a8c82c-ring-data-devices\") pod \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.352853 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9767681b-294e-4f00-a8b1-aa1a44a8c82c-etc-swift\") pod \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.352919 4738 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9767681b-294e-4f00-a8b1-aa1a44a8c82c-dispersionconf\") pod \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.353007 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9767681b-294e-4f00-a8b1-aa1a44a8c82c-scripts\") pod \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.353078 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9767681b-294e-4f00-a8b1-aa1a44a8c82c-swiftconf\") pod \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\" (UID: \"9767681b-294e-4f00-a8b1-aa1a44a8c82c\") " Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.353221 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9767681b-294e-4f00-a8b1-aa1a44a8c82c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9767681b-294e-4f00-a8b1-aa1a44a8c82c" (UID: "9767681b-294e-4f00-a8b1-aa1a44a8c82c"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.353612 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9767681b-294e-4f00-a8b1-aa1a44a8c82c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.353960 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9767681b-294e-4f00-a8b1-aa1a44a8c82c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9767681b-294e-4f00-a8b1-aa1a44a8c82c" (UID: "9767681b-294e-4f00-a8b1-aa1a44a8c82c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.358140 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9767681b-294e-4f00-a8b1-aa1a44a8c82c-kube-api-access-f2csv" (OuterVolumeSpecName: "kube-api-access-f2csv") pod "9767681b-294e-4f00-a8b1-aa1a44a8c82c" (UID: "9767681b-294e-4f00-a8b1-aa1a44a8c82c"). InnerVolumeSpecName "kube-api-access-f2csv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.375056 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9767681b-294e-4f00-a8b1-aa1a44a8c82c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9767681b-294e-4f00-a8b1-aa1a44a8c82c" (UID: "9767681b-294e-4f00-a8b1-aa1a44a8c82c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.376752 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9767681b-294e-4f00-a8b1-aa1a44a8c82c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9767681b-294e-4f00-a8b1-aa1a44a8c82c" (UID: "9767681b-294e-4f00-a8b1-aa1a44a8c82c"). 
InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.378576 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9767681b-294e-4f00-a8b1-aa1a44a8c82c-scripts" (OuterVolumeSpecName: "scripts") pod "9767681b-294e-4f00-a8b1-aa1a44a8c82c" (UID: "9767681b-294e-4f00-a8b1-aa1a44a8c82c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.455313 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9767681b-294e-4f00-a8b1-aa1a44a8c82c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.455354 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9767681b-294e-4f00-a8b1-aa1a44a8c82c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.455368 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9767681b-294e-4f00-a8b1-aa1a44a8c82c-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.455378 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9767681b-294e-4f00-a8b1-aa1a44a8c82c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.455392 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2csv\" (UniqueName: \"kubernetes.io/projected/9767681b-294e-4f00-a8b1-aa1a44a8c82c-kube-api-access-f2csv\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.905963 4738 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="1efc5f75fc3e6d3913dbf7b4b6f2ae321c8ff6140884c8e588de864edf5d7d76" Mar 07 07:40:11 crc kubenswrapper[4738]: I0307 07:40:11.906018 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-v9d4w" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.403045 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9767681b-294e-4f00-a8b1-aa1a44a8c82c" path="/var/lib/kubelet/pods/9767681b-294e-4f00-a8b1-aa1a44a8c82c/volumes" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.450965 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5"] Mar 07 07:40:12 crc kubenswrapper[4738]: E0307 07:40:12.451516 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9767681b-294e-4f00-a8b1-aa1a44a8c82c" containerName="swift-ring-rebalance" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.451547 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="9767681b-294e-4f00-a8b1-aa1a44a8c82c" containerName="swift-ring-rebalance" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.451845 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="9767681b-294e-4f00-a8b1-aa1a44a8c82c" containerName="swift-ring-rebalance" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.452899 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.457785 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.472147 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.479271 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5"] Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.574369 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e9858b5b-5ac9-4a71-b340-f773bc32b848-ring-data-devices\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.574547 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e9858b5b-5ac9-4a71-b340-f773bc32b848-swiftconf\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.574630 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzljw\" (UniqueName: \"kubernetes.io/projected/e9858b5b-5ac9-4a71-b340-f773bc32b848-kube-api-access-tzljw\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.574704 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e9858b5b-5ac9-4a71-b340-f773bc32b848-etc-swift\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.574770 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9858b5b-5ac9-4a71-b340-f773bc32b848-scripts\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.574861 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e9858b5b-5ac9-4a71-b340-f773bc32b848-dispersionconf\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.676658 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e9858b5b-5ac9-4a71-b340-f773bc32b848-swiftconf\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.676716 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzljw\" (UniqueName: \"kubernetes.io/projected/e9858b5b-5ac9-4a71-b340-f773bc32b848-kube-api-access-tzljw\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc 
kubenswrapper[4738]: I0307 07:40:12.676735 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e9858b5b-5ac9-4a71-b340-f773bc32b848-etc-swift\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.676780 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9858b5b-5ac9-4a71-b340-f773bc32b848-scripts\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.676822 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e9858b5b-5ac9-4a71-b340-f773bc32b848-dispersionconf\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.676857 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e9858b5b-5ac9-4a71-b340-f773bc32b848-ring-data-devices\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.677544 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e9858b5b-5ac9-4a71-b340-f773bc32b848-etc-swift\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc 
kubenswrapper[4738]: I0307 07:40:12.677902 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e9858b5b-5ac9-4a71-b340-f773bc32b848-ring-data-devices\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.678086 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9858b5b-5ac9-4a71-b340-f773bc32b848-scripts\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.689238 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e9858b5b-5ac9-4a71-b340-f773bc32b848-swiftconf\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.689310 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e9858b5b-5ac9-4a71-b340-f773bc32b848-dispersionconf\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: I0307 07:40:12.705686 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzljw\" (UniqueName: \"kubernetes.io/projected/e9858b5b-5ac9-4a71-b340-f773bc32b848-kube-api-access-tzljw\") pod \"swift-ring-rebalance-debug-fbdz5\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:12 crc kubenswrapper[4738]: 
I0307 07:40:12.788515 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:13 crc kubenswrapper[4738]: I0307 07:40:13.227119 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5"] Mar 07 07:40:13 crc kubenswrapper[4738]: I0307 07:40:13.927850 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" event={"ID":"e9858b5b-5ac9-4a71-b340-f773bc32b848","Type":"ContainerStarted","Data":"1d6f8543e7c8a885d5b9742fdd7c136bdab0a4aff38306f24a5ad8fdbb3f7163"} Mar 07 07:40:13 crc kubenswrapper[4738]: I0307 07:40:13.929468 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" event={"ID":"e9858b5b-5ac9-4a71-b340-f773bc32b848","Type":"ContainerStarted","Data":"25e759612f63569abc1569b0584588c0e5bb2ae2da359d57636b515e0fdb5dfa"} Mar 07 07:40:14 crc kubenswrapper[4738]: I0307 07:40:14.968249 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" podStartSLOduration=2.96822409 podStartE2EDuration="2.96822409s" podCreationTimestamp="2026-03-07 07:40:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:40:14.955999953 +0000 UTC m=+2433.420987354" watchObservedRunningTime="2026-03-07 07:40:14.96822409 +0000 UTC m=+2433.433211441" Mar 07 07:40:15 crc kubenswrapper[4738]: I0307 07:40:15.960827 4738 generic.go:334] "Generic (PLEG): container finished" podID="e9858b5b-5ac9-4a71-b340-f773bc32b848" containerID="1d6f8543e7c8a885d5b9742fdd7c136bdab0a4aff38306f24a5ad8fdbb3f7163" exitCode=0 Mar 07 07:40:15 crc kubenswrapper[4738]: I0307 07:40:15.960894 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" 
event={"ID":"e9858b5b-5ac9-4a71-b340-f773bc32b848","Type":"ContainerDied","Data":"1d6f8543e7c8a885d5b9742fdd7c136bdab0a4aff38306f24a5ad8fdbb3f7163"} Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.264933 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.303502 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5"] Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.308460 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5"] Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.380458 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e9858b5b-5ac9-4a71-b340-f773bc32b848-ring-data-devices\") pod \"e9858b5b-5ac9-4a71-b340-f773bc32b848\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.380738 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e9858b5b-5ac9-4a71-b340-f773bc32b848-etc-swift\") pod \"e9858b5b-5ac9-4a71-b340-f773bc32b848\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.380820 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9858b5b-5ac9-4a71-b340-f773bc32b848-scripts\") pod \"e9858b5b-5ac9-4a71-b340-f773bc32b848\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.380845 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzljw\" (UniqueName: 
\"kubernetes.io/projected/e9858b5b-5ac9-4a71-b340-f773bc32b848-kube-api-access-tzljw\") pod \"e9858b5b-5ac9-4a71-b340-f773bc32b848\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.380898 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e9858b5b-5ac9-4a71-b340-f773bc32b848-dispersionconf\") pod \"e9858b5b-5ac9-4a71-b340-f773bc32b848\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.380938 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e9858b5b-5ac9-4a71-b340-f773bc32b848-swiftconf\") pod \"e9858b5b-5ac9-4a71-b340-f773bc32b848\" (UID: \"e9858b5b-5ac9-4a71-b340-f773bc32b848\") " Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.381178 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9858b5b-5ac9-4a71-b340-f773bc32b848-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e9858b5b-5ac9-4a71-b340-f773bc32b848" (UID: "e9858b5b-5ac9-4a71-b340-f773bc32b848"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.381319 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e9858b5b-5ac9-4a71-b340-f773bc32b848-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.382406 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9858b5b-5ac9-4a71-b340-f773bc32b848-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e9858b5b-5ac9-4a71-b340-f773bc32b848" (UID: "e9858b5b-5ac9-4a71-b340-f773bc32b848"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.386876 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9858b5b-5ac9-4a71-b340-f773bc32b848-kube-api-access-tzljw" (OuterVolumeSpecName: "kube-api-access-tzljw") pod "e9858b5b-5ac9-4a71-b340-f773bc32b848" (UID: "e9858b5b-5ac9-4a71-b340-f773bc32b848"). InnerVolumeSpecName "kube-api-access-tzljw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.409583 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9858b5b-5ac9-4a71-b340-f773bc32b848-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e9858b5b-5ac9-4a71-b340-f773bc32b848" (UID: "e9858b5b-5ac9-4a71-b340-f773bc32b848"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.412280 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9858b5b-5ac9-4a71-b340-f773bc32b848-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e9858b5b-5ac9-4a71-b340-f773bc32b848" (UID: "e9858b5b-5ac9-4a71-b340-f773bc32b848"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.421592 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e9858b5b-5ac9-4a71-b340-f773bc32b848-scripts" (OuterVolumeSpecName: "scripts") pod "e9858b5b-5ac9-4a71-b340-f773bc32b848" (UID: "e9858b5b-5ac9-4a71-b340-f773bc32b848"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.483291 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e9858b5b-5ac9-4a71-b340-f773bc32b848-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.483349 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e9858b5b-5ac9-4a71-b340-f773bc32b848-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.483368 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e9858b5b-5ac9-4a71-b340-f773bc32b848-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.483386 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e9858b5b-5ac9-4a71-b340-f773bc32b848-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.483405 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzljw\" (UniqueName: \"kubernetes.io/projected/e9858b5b-5ac9-4a71-b340-f773bc32b848-kube-api-access-tzljw\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.988468 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25e759612f63569abc1569b0584588c0e5bb2ae2da359d57636b515e0fdb5dfa" Mar 07 07:40:17 crc kubenswrapper[4738]: I0307 07:40:17.988923 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fbdz5" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.403850 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9858b5b-5ac9-4a71-b340-f773bc32b848" path="/var/lib/kubelet/pods/e9858b5b-5ac9-4a71-b340-f773bc32b848/volumes" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.500997 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zrx76"] Mar 07 07:40:18 crc kubenswrapper[4738]: E0307 07:40:18.501327 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9858b5b-5ac9-4a71-b340-f773bc32b848" containerName="swift-ring-rebalance" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.501340 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9858b5b-5ac9-4a71-b340-f773bc32b848" containerName="swift-ring-rebalance" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.501483 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9858b5b-5ac9-4a71-b340-f773bc32b848" containerName="swift-ring-rebalance" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.501999 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.510148 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.510786 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.516940 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zrx76"] Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.600936 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f28265c-127e-40a8-8148-02f26cce389a-swiftconf\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.601024 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f28265c-127e-40a8-8148-02f26cce389a-ring-data-devices\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.601077 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f28265c-127e-40a8-8148-02f26cce389a-scripts\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.601110 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f28265c-127e-40a8-8148-02f26cce389a-dispersionconf\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.601150 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvr7t\" (UniqueName: \"kubernetes.io/projected/5f28265c-127e-40a8-8148-02f26cce389a-kube-api-access-zvr7t\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.601214 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f28265c-127e-40a8-8148-02f26cce389a-etc-swift\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.703278 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f28265c-127e-40a8-8148-02f26cce389a-swiftconf\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.703420 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f28265c-127e-40a8-8148-02f26cce389a-ring-data-devices\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 
crc kubenswrapper[4738]: I0307 07:40:18.703506 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f28265c-127e-40a8-8148-02f26cce389a-scripts\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.703563 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f28265c-127e-40a8-8148-02f26cce389a-dispersionconf\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.703637 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvr7t\" (UniqueName: \"kubernetes.io/projected/5f28265c-127e-40a8-8148-02f26cce389a-kube-api-access-zvr7t\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.703706 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f28265c-127e-40a8-8148-02f26cce389a-etc-swift\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.704544 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f28265c-127e-40a8-8148-02f26cce389a-etc-swift\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc 
kubenswrapper[4738]: I0307 07:40:18.705044 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f28265c-127e-40a8-8148-02f26cce389a-scripts\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.705082 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f28265c-127e-40a8-8148-02f26cce389a-ring-data-devices\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.709764 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f28265c-127e-40a8-8148-02f26cce389a-swiftconf\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.712641 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f28265c-127e-40a8-8148-02f26cce389a-dispersionconf\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: I0307 07:40:18.724482 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvr7t\" (UniqueName: \"kubernetes.io/projected/5f28265c-127e-40a8-8148-02f26cce389a-kube-api-access-zvr7t\") pod \"swift-ring-rebalance-debug-zrx76\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:18 crc kubenswrapper[4738]: 
I0307 07:40:18.831251 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:19 crc kubenswrapper[4738]: I0307 07:40:19.141665 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zrx76"] Mar 07 07:40:20 crc kubenswrapper[4738]: I0307 07:40:20.046490 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" event={"ID":"5f28265c-127e-40a8-8148-02f26cce389a","Type":"ContainerStarted","Data":"970881ca3efb891b5ebf507eeacd6bb1b036e04b5f582affe53a0c08bd013c5c"} Mar 07 07:40:20 crc kubenswrapper[4738]: I0307 07:40:20.046912 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" event={"ID":"5f28265c-127e-40a8-8148-02f26cce389a","Type":"ContainerStarted","Data":"23faa385eb669b2227fc15754fe2fde2f9829ca9f79a9061c409b77f43429080"} Mar 07 07:40:20 crc kubenswrapper[4738]: I0307 07:40:20.077517 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" podStartSLOduration=2.077496948 podStartE2EDuration="2.077496948s" podCreationTimestamp="2026-03-07 07:40:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:40:20.067701647 +0000 UTC m=+2438.532688988" watchObservedRunningTime="2026-03-07 07:40:20.077496948 +0000 UTC m=+2438.542484279" Mar 07 07:40:22 crc kubenswrapper[4738]: I0307 07:40:22.071956 4738 generic.go:334] "Generic (PLEG): container finished" podID="5f28265c-127e-40a8-8148-02f26cce389a" containerID="970881ca3efb891b5ebf507eeacd6bb1b036e04b5f582affe53a0c08bd013c5c" exitCode=0 Mar 07 07:40:22 crc kubenswrapper[4738]: I0307 07:40:22.072096 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" 
event={"ID":"5f28265c-127e-40a8-8148-02f26cce389a","Type":"ContainerDied","Data":"970881ca3efb891b5ebf507eeacd6bb1b036e04b5f582affe53a0c08bd013c5c"} Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.431396 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.486435 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zrx76"] Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.499321 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-zrx76"] Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.594997 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvr7t\" (UniqueName: \"kubernetes.io/projected/5f28265c-127e-40a8-8148-02f26cce389a-kube-api-access-zvr7t\") pod \"5f28265c-127e-40a8-8148-02f26cce389a\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.595069 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f28265c-127e-40a8-8148-02f26cce389a-swiftconf\") pod \"5f28265c-127e-40a8-8148-02f26cce389a\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.595121 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f28265c-127e-40a8-8148-02f26cce389a-etc-swift\") pod \"5f28265c-127e-40a8-8148-02f26cce389a\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.595179 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/5f28265c-127e-40a8-8148-02f26cce389a-dispersionconf\") pod \"5f28265c-127e-40a8-8148-02f26cce389a\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.595288 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f28265c-127e-40a8-8148-02f26cce389a-scripts\") pod \"5f28265c-127e-40a8-8148-02f26cce389a\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.595322 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f28265c-127e-40a8-8148-02f26cce389a-ring-data-devices\") pod \"5f28265c-127e-40a8-8148-02f26cce389a\" (UID: \"5f28265c-127e-40a8-8148-02f26cce389a\") " Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.595812 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5f28265c-127e-40a8-8148-02f26cce389a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5f28265c-127e-40a8-8148-02f26cce389a" (UID: "5f28265c-127e-40a8-8148-02f26cce389a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.596313 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f28265c-127e-40a8-8148-02f26cce389a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "5f28265c-127e-40a8-8148-02f26cce389a" (UID: "5f28265c-127e-40a8-8148-02f26cce389a"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.604516 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f28265c-127e-40a8-8148-02f26cce389a-kube-api-access-zvr7t" (OuterVolumeSpecName: "kube-api-access-zvr7t") pod "5f28265c-127e-40a8-8148-02f26cce389a" (UID: "5f28265c-127e-40a8-8148-02f26cce389a"). InnerVolumeSpecName "kube-api-access-zvr7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.617028 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f28265c-127e-40a8-8148-02f26cce389a-scripts" (OuterVolumeSpecName: "scripts") pod "5f28265c-127e-40a8-8148-02f26cce389a" (UID: "5f28265c-127e-40a8-8148-02f26cce389a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.627410 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f28265c-127e-40a8-8148-02f26cce389a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "5f28265c-127e-40a8-8148-02f26cce389a" (UID: "5f28265c-127e-40a8-8148-02f26cce389a"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.632568 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f28265c-127e-40a8-8148-02f26cce389a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "5f28265c-127e-40a8-8148-02f26cce389a" (UID: "5f28265c-127e-40a8-8148-02f26cce389a"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.697708 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/5f28265c-127e-40a8-8148-02f26cce389a-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.697764 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5f28265c-127e-40a8-8148-02f26cce389a-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.697785 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/5f28265c-127e-40a8-8148-02f26cce389a-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.697804 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvr7t\" (UniqueName: \"kubernetes.io/projected/5f28265c-127e-40a8-8148-02f26cce389a-kube-api-access-zvr7t\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.697831 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/5f28265c-127e-40a8-8148-02f26cce389a-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:23 crc kubenswrapper[4738]: I0307 07:40:23.697853 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/5f28265c-127e-40a8-8148-02f26cce389a-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.113146 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23faa385eb669b2227fc15754fe2fde2f9829ca9f79a9061c409b77f43429080" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.113252 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-zrx76" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.405011 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f28265c-127e-40a8-8148-02f26cce389a" path="/var/lib/kubelet/pods/5f28265c-127e-40a8-8148-02f26cce389a/volumes" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.627674 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg"] Mar 07 07:40:24 crc kubenswrapper[4738]: E0307 07:40:24.628381 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f28265c-127e-40a8-8148-02f26cce389a" containerName="swift-ring-rebalance" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.628405 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f28265c-127e-40a8-8148-02f26cce389a" containerName="swift-ring-rebalance" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.628609 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f28265c-127e-40a8-8148-02f26cce389a" containerName="swift-ring-rebalance" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.630270 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.633093 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.633809 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.647066 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg"] Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.715052 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/05cf80ad-16c3-4d80-a127-02d632a0a024-dispersionconf\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.715131 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpstf\" (UniqueName: \"kubernetes.io/projected/05cf80ad-16c3-4d80-a127-02d632a0a024-kube-api-access-mpstf\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.715190 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/05cf80ad-16c3-4d80-a127-02d632a0a024-ring-data-devices\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.715428 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/05cf80ad-16c3-4d80-a127-02d632a0a024-swiftconf\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.715502 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/05cf80ad-16c3-4d80-a127-02d632a0a024-etc-swift\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.715957 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05cf80ad-16c3-4d80-a127-02d632a0a024-scripts\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.818106 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpstf\" (UniqueName: \"kubernetes.io/projected/05cf80ad-16c3-4d80-a127-02d632a0a024-kube-api-access-mpstf\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.818203 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/05cf80ad-16c3-4d80-a127-02d632a0a024-ring-data-devices\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc 
kubenswrapper[4738]: I0307 07:40:24.818276 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/05cf80ad-16c3-4d80-a127-02d632a0a024-swiftconf\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.818311 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/05cf80ad-16c3-4d80-a127-02d632a0a024-etc-swift\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.818507 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05cf80ad-16c3-4d80-a127-02d632a0a024-scripts\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.818597 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/05cf80ad-16c3-4d80-a127-02d632a0a024-dispersionconf\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.819706 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/05cf80ad-16c3-4d80-a127-02d632a0a024-ring-data-devices\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc 
kubenswrapper[4738]: I0307 07:40:24.819787 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05cf80ad-16c3-4d80-a127-02d632a0a024-scripts\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.819865 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/05cf80ad-16c3-4d80-a127-02d632a0a024-etc-swift\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.824824 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/05cf80ad-16c3-4d80-a127-02d632a0a024-dispersionconf\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.831963 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/05cf80ad-16c3-4d80-a127-02d632a0a024-swiftconf\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 07:40:24.841644 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpstf\" (UniqueName: \"kubernetes.io/projected/05cf80ad-16c3-4d80-a127-02d632a0a024-kube-api-access-mpstf\") pod \"swift-ring-rebalance-debug-bjrgg\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:24 crc kubenswrapper[4738]: I0307 
07:40:24.953396 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:25 crc kubenswrapper[4738]: I0307 07:40:25.443751 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg"] Mar 07 07:40:25 crc kubenswrapper[4738]: W0307 07:40:25.451475 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05cf80ad_16c3_4d80_a127_02d632a0a024.slice/crio-c914fe65a316e36cadd34a5942e937883fc1f7aac18c81aba880fecbd411a50d WatchSource:0}: Error finding container c914fe65a316e36cadd34a5942e937883fc1f7aac18c81aba880fecbd411a50d: Status 404 returned error can't find the container with id c914fe65a316e36cadd34a5942e937883fc1f7aac18c81aba880fecbd411a50d Mar 07 07:40:26 crc kubenswrapper[4738]: I0307 07:40:26.131373 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" event={"ID":"05cf80ad-16c3-4d80-a127-02d632a0a024","Type":"ContainerStarted","Data":"9ac421ac9a273f592cdfd7681b68d521cebc53b64a606b7905a3cd78d7639e2c"} Mar 07 07:40:26 crc kubenswrapper[4738]: I0307 07:40:26.131675 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" event={"ID":"05cf80ad-16c3-4d80-a127-02d632a0a024","Type":"ContainerStarted","Data":"c914fe65a316e36cadd34a5942e937883fc1f7aac18c81aba880fecbd411a50d"} Mar 07 07:40:26 crc kubenswrapper[4738]: I0307 07:40:26.151979 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" podStartSLOduration=2.151961694 podStartE2EDuration="2.151961694s" podCreationTimestamp="2026-03-07 07:40:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:40:26.14619206 +0000 UTC 
m=+2444.611179381" watchObservedRunningTime="2026-03-07 07:40:26.151961694 +0000 UTC m=+2444.616949015" Mar 07 07:40:26 crc kubenswrapper[4738]: I0307 07:40:26.957670 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:40:26 crc kubenswrapper[4738]: I0307 07:40:26.957979 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:40:27 crc kubenswrapper[4738]: I0307 07:40:27.141918 4738 generic.go:334] "Generic (PLEG): container finished" podID="05cf80ad-16c3-4d80-a127-02d632a0a024" containerID="9ac421ac9a273f592cdfd7681b68d521cebc53b64a606b7905a3cd78d7639e2c" exitCode=0 Mar 07 07:40:27 crc kubenswrapper[4738]: I0307 07:40:27.142023 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" event={"ID":"05cf80ad-16c3-4d80-a127-02d632a0a024","Type":"ContainerDied","Data":"9ac421ac9a273f592cdfd7681b68d521cebc53b64a606b7905a3cd78d7639e2c"} Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.454384 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.496967 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg"] Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.505373 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg"] Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.574421 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/05cf80ad-16c3-4d80-a127-02d632a0a024-dispersionconf\") pod \"05cf80ad-16c3-4d80-a127-02d632a0a024\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.574506 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/05cf80ad-16c3-4d80-a127-02d632a0a024-etc-swift\") pod \"05cf80ad-16c3-4d80-a127-02d632a0a024\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.574546 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/05cf80ad-16c3-4d80-a127-02d632a0a024-swiftconf\") pod \"05cf80ad-16c3-4d80-a127-02d632a0a024\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.574602 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/05cf80ad-16c3-4d80-a127-02d632a0a024-ring-data-devices\") pod \"05cf80ad-16c3-4d80-a127-02d632a0a024\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.574642 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-mpstf\" (UniqueName: \"kubernetes.io/projected/05cf80ad-16c3-4d80-a127-02d632a0a024-kube-api-access-mpstf\") pod \"05cf80ad-16c3-4d80-a127-02d632a0a024\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.574671 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05cf80ad-16c3-4d80-a127-02d632a0a024-scripts\") pod \"05cf80ad-16c3-4d80-a127-02d632a0a024\" (UID: \"05cf80ad-16c3-4d80-a127-02d632a0a024\") " Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.576758 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05cf80ad-16c3-4d80-a127-02d632a0a024-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "05cf80ad-16c3-4d80-a127-02d632a0a024" (UID: "05cf80ad-16c3-4d80-a127-02d632a0a024"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.576864 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05cf80ad-16c3-4d80-a127-02d632a0a024-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "05cf80ad-16c3-4d80-a127-02d632a0a024" (UID: "05cf80ad-16c3-4d80-a127-02d632a0a024"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.580930 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05cf80ad-16c3-4d80-a127-02d632a0a024-kube-api-access-mpstf" (OuterVolumeSpecName: "kube-api-access-mpstf") pod "05cf80ad-16c3-4d80-a127-02d632a0a024" (UID: "05cf80ad-16c3-4d80-a127-02d632a0a024"). InnerVolumeSpecName "kube-api-access-mpstf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.604826 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/05cf80ad-16c3-4d80-a127-02d632a0a024-scripts" (OuterVolumeSpecName: "scripts") pod "05cf80ad-16c3-4d80-a127-02d632a0a024" (UID: "05cf80ad-16c3-4d80-a127-02d632a0a024"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.608762 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05cf80ad-16c3-4d80-a127-02d632a0a024-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "05cf80ad-16c3-4d80-a127-02d632a0a024" (UID: "05cf80ad-16c3-4d80-a127-02d632a0a024"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.611480 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05cf80ad-16c3-4d80-a127-02d632a0a024-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "05cf80ad-16c3-4d80-a127-02d632a0a024" (UID: "05cf80ad-16c3-4d80-a127-02d632a0a024"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.676576 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/05cf80ad-16c3-4d80-a127-02d632a0a024-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.676606 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/05cf80ad-16c3-4d80-a127-02d632a0a024-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.676616 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/05cf80ad-16c3-4d80-a127-02d632a0a024-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.676624 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/05cf80ad-16c3-4d80-a127-02d632a0a024-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.676634 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpstf\" (UniqueName: \"kubernetes.io/projected/05cf80ad-16c3-4d80-a127-02d632a0a024-kube-api-access-mpstf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:28 crc kubenswrapper[4738]: I0307 07:40:28.676644 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/05cf80ad-16c3-4d80-a127-02d632a0a024-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.169270 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-bjrgg" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.171363 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c914fe65a316e36cadd34a5942e937883fc1f7aac18c81aba880fecbd411a50d" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.651303 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wglch"] Mar 07 07:40:29 crc kubenswrapper[4738]: E0307 07:40:29.652466 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05cf80ad-16c3-4d80-a127-02d632a0a024" containerName="swift-ring-rebalance" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.652575 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="05cf80ad-16c3-4d80-a127-02d632a0a024" containerName="swift-ring-rebalance" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.652921 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="05cf80ad-16c3-4d80-a127-02d632a0a024" containerName="swift-ring-rebalance" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.653878 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.656393 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.656856 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.668349 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wglch"] Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.766750 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-etc-swift\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.766812 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-scripts\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.766905 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-dispersionconf\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.766946 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-ring-data-devices\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.766991 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-swiftconf\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.767212 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn6s7\" (UniqueName: \"kubernetes.io/projected/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-kube-api-access-rn6s7\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.868450 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-etc-swift\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.868557 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-scripts\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.868646 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-dispersionconf\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.868684 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-ring-data-devices\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.868735 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-swiftconf\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.868813 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn6s7\" (UniqueName: \"kubernetes.io/projected/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-kube-api-access-rn6s7\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.868892 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-etc-swift\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 
07:40:29.869507 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-ring-data-devices\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.870114 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-scripts\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.875374 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-dispersionconf\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.875981 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-swiftconf\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.899366 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn6s7\" (UniqueName: \"kubernetes.io/projected/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-kube-api-access-rn6s7\") pod \"swift-ring-rebalance-debug-wglch\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:29 crc kubenswrapper[4738]: I0307 07:40:29.982019 4738 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" Mar 07 07:40:30 crc kubenswrapper[4738]: I0307 07:40:30.394379 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05cf80ad-16c3-4d80-a127-02d632a0a024" path="/var/lib/kubelet/pods/05cf80ad-16c3-4d80-a127-02d632a0a024/volumes" Mar 07 07:40:30 crc kubenswrapper[4738]: I0307 07:40:30.410388 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wglch"] Mar 07 07:40:30 crc kubenswrapper[4738]: W0307 07:40:30.413486 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6cbec8c_92f0_46ff_a58c_e5ce40fc3599.slice/crio-a42ab3d10a810deaa38178c20629f8d4dbdfc2142af00373eb355e067689891b WatchSource:0}: Error finding container a42ab3d10a810deaa38178c20629f8d4dbdfc2142af00373eb355e067689891b: Status 404 returned error can't find the container with id a42ab3d10a810deaa38178c20629f8d4dbdfc2142af00373eb355e067689891b Mar 07 07:40:31 crc kubenswrapper[4738]: I0307 07:40:31.194024 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" event={"ID":"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599","Type":"ContainerStarted","Data":"b1dd50e628df8e935a3aea935e54a1b59e7007b3551ee2d3f78864bcde759079"} Mar 07 07:40:31 crc kubenswrapper[4738]: I0307 07:40:31.194467 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" event={"ID":"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599","Type":"ContainerStarted","Data":"a42ab3d10a810deaa38178c20629f8d4dbdfc2142af00373eb355e067689891b"} Mar 07 07:40:31 crc kubenswrapper[4738]: I0307 07:40:31.226031 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" podStartSLOduration=2.226004625 
podStartE2EDuration="2.226004625s" podCreationTimestamp="2026-03-07 07:40:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:40:31.217861778 +0000 UTC m=+2449.682849119" watchObservedRunningTime="2026-03-07 07:40:31.226004625 +0000 UTC m=+2449.690991976"
Mar 07 07:40:32 crc kubenswrapper[4738]: I0307 07:40:32.208948 4738 generic.go:334] "Generic (PLEG): container finished" podID="a6cbec8c-92f0-46ff-a58c-e5ce40fc3599" containerID="b1dd50e628df8e935a3aea935e54a1b59e7007b3551ee2d3f78864bcde759079" exitCode=0
Mar 07 07:40:32 crc kubenswrapper[4738]: I0307 07:40:32.209030 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch" event={"ID":"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599","Type":"ContainerDied","Data":"b1dd50e628df8e935a3aea935e54a1b59e7007b3551ee2d3f78864bcde759079"}
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.561607 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch"
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.596599 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wglch"]
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.608992 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-wglch"]
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.726767 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-dispersionconf\") pod \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") "
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.726881 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-etc-swift\") pod \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") "
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.726918 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-swiftconf\") pod \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") "
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.726963 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-scripts\") pod \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") "
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.727030 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn6s7\" (UniqueName: \"kubernetes.io/projected/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-kube-api-access-rn6s7\") pod \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") "
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.727066 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-ring-data-devices\") pod \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\" (UID: \"a6cbec8c-92f0-46ff-a58c-e5ce40fc3599\") "
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.727939 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "a6cbec8c-92f0-46ff-a58c-e5ce40fc3599" (UID: "a6cbec8c-92f0-46ff-a58c-e5ce40fc3599"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.730148 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "a6cbec8c-92f0-46ff-a58c-e5ce40fc3599" (UID: "a6cbec8c-92f0-46ff-a58c-e5ce40fc3599"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.754389 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "a6cbec8c-92f0-46ff-a58c-e5ce40fc3599" (UID: "a6cbec8c-92f0-46ff-a58c-e5ce40fc3599"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.776445 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-kube-api-access-rn6s7" (OuterVolumeSpecName: "kube-api-access-rn6s7") pod "a6cbec8c-92f0-46ff-a58c-e5ce40fc3599" (UID: "a6cbec8c-92f0-46ff-a58c-e5ce40fc3599"). InnerVolumeSpecName "kube-api-access-rn6s7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.807332 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "a6cbec8c-92f0-46ff-a58c-e5ce40fc3599" (UID: "a6cbec8c-92f0-46ff-a58c-e5ce40fc3599"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.825639 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-scripts" (OuterVolumeSpecName: "scripts") pod "a6cbec8c-92f0-46ff-a58c-e5ce40fc3599" (UID: "a6cbec8c-92f0-46ff-a58c-e5ce40fc3599"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.829114 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.829139 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn6s7\" (UniqueName: \"kubernetes.io/projected/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-kube-api-access-rn6s7\") on node \"crc\" DevicePath \"\""
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.829151 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.829197 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.829208 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 07 07:40:33 crc kubenswrapper[4738]: I0307 07:40:33.829216 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.230131 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a42ab3d10a810deaa38178c20629f8d4dbdfc2142af00373eb355e067689891b"
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.230243 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-wglch"
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.400822 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6cbec8c-92f0-46ff-a58c-e5ce40fc3599" path="/var/lib/kubelet/pods/a6cbec8c-92f0-46ff-a58c-e5ce40fc3599/volumes"
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.797843 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"]
Mar 07 07:40:34 crc kubenswrapper[4738]: E0307 07:40:34.799479 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6cbec8c-92f0-46ff-a58c-e5ce40fc3599" containerName="swift-ring-rebalance"
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.799613 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6cbec8c-92f0-46ff-a58c-e5ce40fc3599" containerName="swift-ring-rebalance"
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.799948 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6cbec8c-92f0-46ff-a58c-e5ce40fc3599" containerName="swift-ring-rebalance"
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.800939 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.804491 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.806657 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.813607 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"]
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.947008 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/122a9ebd-108c-435d-921e-ad99b1b4ca4f-dispersionconf\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.947095 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/122a9ebd-108c-435d-921e-ad99b1b4ca4f-scripts\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.947117 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/122a9ebd-108c-435d-921e-ad99b1b4ca4f-swiftconf\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.947137 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/122a9ebd-108c-435d-921e-ad99b1b4ca4f-etc-swift\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.947177 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhwxk\" (UniqueName: \"kubernetes.io/projected/122a9ebd-108c-435d-921e-ad99b1b4ca4f-kube-api-access-hhwxk\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:34 crc kubenswrapper[4738]: I0307 07:40:34.947217 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/122a9ebd-108c-435d-921e-ad99b1b4ca4f-ring-data-devices\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:35 crc kubenswrapper[4738]: I0307 07:40:35.049351 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhwxk\" (UniqueName: \"kubernetes.io/projected/122a9ebd-108c-435d-921e-ad99b1b4ca4f-kube-api-access-hhwxk\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:35 crc kubenswrapper[4738]: I0307 07:40:35.049453 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/122a9ebd-108c-435d-921e-ad99b1b4ca4f-ring-data-devices\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:35 crc kubenswrapper[4738]: I0307 07:40:35.049544 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/122a9ebd-108c-435d-921e-ad99b1b4ca4f-dispersionconf\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:35 crc kubenswrapper[4738]: I0307 07:40:35.049619 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/122a9ebd-108c-435d-921e-ad99b1b4ca4f-scripts\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:35 crc kubenswrapper[4738]: I0307 07:40:35.049644 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/122a9ebd-108c-435d-921e-ad99b1b4ca4f-swiftconf\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:35 crc kubenswrapper[4738]: I0307 07:40:35.049664 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/122a9ebd-108c-435d-921e-ad99b1b4ca4f-etc-swift\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:35 crc kubenswrapper[4738]: I0307 07:40:35.050303 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/122a9ebd-108c-435d-921e-ad99b1b4ca4f-etc-swift\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:35 crc kubenswrapper[4738]: I0307 07:40:35.051137 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/122a9ebd-108c-435d-921e-ad99b1b4ca4f-scripts\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:35 crc kubenswrapper[4738]: I0307 07:40:35.051886 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/122a9ebd-108c-435d-921e-ad99b1b4ca4f-ring-data-devices\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:35 crc kubenswrapper[4738]: I0307 07:40:35.055149 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/122a9ebd-108c-435d-921e-ad99b1b4ca4f-dispersionconf\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:35 crc kubenswrapper[4738]: I0307 07:40:35.055763 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/122a9ebd-108c-435d-921e-ad99b1b4ca4f-swiftconf\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:35 crc kubenswrapper[4738]: I0307 07:40:35.073104 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhwxk\" (UniqueName: \"kubernetes.io/projected/122a9ebd-108c-435d-921e-ad99b1b4ca4f-kube-api-access-hhwxk\") pod \"swift-ring-rebalance-debug-9r4s9\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:35 crc kubenswrapper[4738]: I0307 07:40:35.135356 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:35 crc kubenswrapper[4738]: I0307 07:40:35.603170 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"]
Mar 07 07:40:35 crc kubenswrapper[4738]: W0307 07:40:35.612911 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod122a9ebd_108c_435d_921e_ad99b1b4ca4f.slice/crio-9c7e8cc73b4e524ab38d2d0815f7a6b038907a01423293952b3bab300ecb4240 WatchSource:0}: Error finding container 9c7e8cc73b4e524ab38d2d0815f7a6b038907a01423293952b3bab300ecb4240: Status 404 returned error can't find the container with id 9c7e8cc73b4e524ab38d2d0815f7a6b038907a01423293952b3bab300ecb4240
Mar 07 07:40:36 crc kubenswrapper[4738]: I0307 07:40:36.249422 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9" event={"ID":"122a9ebd-108c-435d-921e-ad99b1b4ca4f","Type":"ContainerStarted","Data":"b10c17d9490e9914efc5e0ac5da845a28f8b08a16a98e4c4781722cb12288844"}
Mar 07 07:40:36 crc kubenswrapper[4738]: I0307 07:40:36.249731 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9" event={"ID":"122a9ebd-108c-435d-921e-ad99b1b4ca4f","Type":"ContainerStarted","Data":"9c7e8cc73b4e524ab38d2d0815f7a6b038907a01423293952b3bab300ecb4240"}
Mar 07 07:40:36 crc kubenswrapper[4738]: I0307 07:40:36.271898 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9" podStartSLOduration=2.271882855 podStartE2EDuration="2.271882855s" podCreationTimestamp="2026-03-07 07:40:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:40:36.266031369 +0000 UTC m=+2454.731018710" watchObservedRunningTime="2026-03-07 07:40:36.271882855 +0000 UTC m=+2454.736870176"
Mar 07 07:40:38 crc kubenswrapper[4738]: I0307 07:40:38.273108 4738 generic.go:334] "Generic (PLEG): container finished" podID="122a9ebd-108c-435d-921e-ad99b1b4ca4f" containerID="b10c17d9490e9914efc5e0ac5da845a28f8b08a16a98e4c4781722cb12288844" exitCode=0
Mar 07 07:40:38 crc kubenswrapper[4738]: I0307 07:40:38.273214 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9" event={"ID":"122a9ebd-108c-435d-921e-ad99b1b4ca4f","Type":"ContainerDied","Data":"b10c17d9490e9914efc5e0ac5da845a28f8b08a16a98e4c4781722cb12288844"}
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.600725 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.640837 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"]
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.647633 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"]
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.724571 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/122a9ebd-108c-435d-921e-ad99b1b4ca4f-etc-swift\") pod \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") "
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.724673 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/122a9ebd-108c-435d-921e-ad99b1b4ca4f-scripts\") pod \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") "
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.724697 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/122a9ebd-108c-435d-921e-ad99b1b4ca4f-swiftconf\") pod \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") "
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.724764 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhwxk\" (UniqueName: \"kubernetes.io/projected/122a9ebd-108c-435d-921e-ad99b1b4ca4f-kube-api-access-hhwxk\") pod \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") "
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.725722 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/122a9ebd-108c-435d-921e-ad99b1b4ca4f-ring-data-devices\") pod \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") "
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.725753 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/122a9ebd-108c-435d-921e-ad99b1b4ca4f-dispersionconf\") pod \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\" (UID: \"122a9ebd-108c-435d-921e-ad99b1b4ca4f\") "
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.726366 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/122a9ebd-108c-435d-921e-ad99b1b4ca4f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "122a9ebd-108c-435d-921e-ad99b1b4ca4f" (UID: "122a9ebd-108c-435d-921e-ad99b1b4ca4f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.726418 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/122a9ebd-108c-435d-921e-ad99b1b4ca4f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "122a9ebd-108c-435d-921e-ad99b1b4ca4f" (UID: "122a9ebd-108c-435d-921e-ad99b1b4ca4f"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.731384 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/122a9ebd-108c-435d-921e-ad99b1b4ca4f-kube-api-access-hhwxk" (OuterVolumeSpecName: "kube-api-access-hhwxk") pod "122a9ebd-108c-435d-921e-ad99b1b4ca4f" (UID: "122a9ebd-108c-435d-921e-ad99b1b4ca4f"). InnerVolumeSpecName "kube-api-access-hhwxk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.750140 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/122a9ebd-108c-435d-921e-ad99b1b4ca4f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "122a9ebd-108c-435d-921e-ad99b1b4ca4f" (UID: "122a9ebd-108c-435d-921e-ad99b1b4ca4f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.750574 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/122a9ebd-108c-435d-921e-ad99b1b4ca4f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "122a9ebd-108c-435d-921e-ad99b1b4ca4f" (UID: "122a9ebd-108c-435d-921e-ad99b1b4ca4f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.751420 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/122a9ebd-108c-435d-921e-ad99b1b4ca4f-scripts" (OuterVolumeSpecName: "scripts") pod "122a9ebd-108c-435d-921e-ad99b1b4ca4f" (UID: "122a9ebd-108c-435d-921e-ad99b1b4ca4f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.827535 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhwxk\" (UniqueName: \"kubernetes.io/projected/122a9ebd-108c-435d-921e-ad99b1b4ca4f-kube-api-access-hhwxk\") on node \"crc\" DevicePath \"\""
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.827567 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/122a9ebd-108c-435d-921e-ad99b1b4ca4f-ring-data-devices\") on node \"crc\" DevicePath \"\""
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.827578 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/122a9ebd-108c-435d-921e-ad99b1b4ca4f-dispersionconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.827586 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/122a9ebd-108c-435d-921e-ad99b1b4ca4f-etc-swift\") on node \"crc\" DevicePath \"\""
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.827594 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/122a9ebd-108c-435d-921e-ad99b1b4ca4f-scripts\") on node \"crc\" DevicePath \"\""
Mar 07 07:40:39 crc kubenswrapper[4738]: I0307 07:40:39.827603 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/122a9ebd-108c-435d-921e-ad99b1b4ca4f-swiftconf\") on node \"crc\" DevicePath \"\""
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.300233 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c7e8cc73b4e524ab38d2d0815f7a6b038907a01423293952b3bab300ecb4240"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.300318 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9r4s9"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.393810 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="122a9ebd-108c-435d-921e-ad99b1b4ca4f" path="/var/lib/kubelet/pods/122a9ebd-108c-435d-921e-ad99b1b4ca4f/volumes"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.838993 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"]
Mar 07 07:40:40 crc kubenswrapper[4738]: E0307 07:40:40.839772 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="122a9ebd-108c-435d-921e-ad99b1b4ca4f" containerName="swift-ring-rebalance"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.839793 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="122a9ebd-108c-435d-921e-ad99b1b4ca4f" containerName="swift-ring-rebalance"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.840006 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="122a9ebd-108c-435d-921e-ad99b1b4ca4f" containerName="swift-ring-rebalance"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.840842 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.844995 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.845657 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.846691 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"]
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.849259 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b380f515-f2f7-41e5-ab55-44338086a792-scripts\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.849382 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b380f515-f2f7-41e5-ab55-44338086a792-ring-data-devices\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.849420 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b380f515-f2f7-41e5-ab55-44338086a792-dispersionconf\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.849470 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ctsp\" (UniqueName: \"kubernetes.io/projected/b380f515-f2f7-41e5-ab55-44338086a792-kube-api-access-6ctsp\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.849627 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b380f515-f2f7-41e5-ab55-44338086a792-swiftconf\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.849696 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b380f515-f2f7-41e5-ab55-44338086a792-etc-swift\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.950968 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b380f515-f2f7-41e5-ab55-44338086a792-swiftconf\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.951034 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b380f515-f2f7-41e5-ab55-44338086a792-etc-swift\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.951109 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b380f515-f2f7-41e5-ab55-44338086a792-scripts\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.951147 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b380f515-f2f7-41e5-ab55-44338086a792-ring-data-devices\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.951197 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b380f515-f2f7-41e5-ab55-44338086a792-dispersionconf\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.951225 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ctsp\" (UniqueName: \"kubernetes.io/projected/b380f515-f2f7-41e5-ab55-44338086a792-kube-api-access-6ctsp\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.952058 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b380f515-f2f7-41e5-ab55-44338086a792-etc-swift\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.952289 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b380f515-f2f7-41e5-ab55-44338086a792-scripts\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.952636 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b380f515-f2f7-41e5-ab55-44338086a792-ring-data-devices\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.956358 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b380f515-f2f7-41e5-ab55-44338086a792-dispersionconf\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.956469 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b380f515-f2f7-41e5-ab55-44338086a792-swiftconf\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:40 crc kubenswrapper[4738]: I0307 07:40:40.982931 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ctsp\" (UniqueName: \"kubernetes.io/projected/b380f515-f2f7-41e5-ab55-44338086a792-kube-api-access-6ctsp\") pod \"swift-ring-rebalance-debug-7z6xg\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:41 crc kubenswrapper[4738]: I0307 07:40:41.194208 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:41 crc kubenswrapper[4738]: I0307 07:40:41.628936 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"]
Mar 07 07:40:42 crc kubenswrapper[4738]: I0307 07:40:42.334778 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg" event={"ID":"b380f515-f2f7-41e5-ab55-44338086a792","Type":"ContainerStarted","Data":"b971718c5b4a2f871edb9241bc8aa233afa17f97673e268677abe2711aa49fc2"}
Mar 07 07:40:42 crc kubenswrapper[4738]: I0307 07:40:42.335141 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg" event={"ID":"b380f515-f2f7-41e5-ab55-44338086a792","Type":"ContainerStarted","Data":"3eae8de010f40ff1e287306dee430bb57ea23d6633d7bf8641ce1b48693f56ef"}
Mar 07 07:40:42 crc kubenswrapper[4738]: I0307 07:40:42.359116 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg" podStartSLOduration=2.359100761 podStartE2EDuration="2.359100761s" podCreationTimestamp="2026-03-07 07:40:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:40:42.356381338 +0000 UTC m=+2460.821368659" watchObservedRunningTime="2026-03-07 07:40:42.359100761 +0000 UTC m=+2460.824088082"
Mar 07 07:40:43 crc kubenswrapper[4738]: I0307 07:40:43.351706 4738 generic.go:334] "Generic (PLEG): container finished" podID="b380f515-f2f7-41e5-ab55-44338086a792" containerID="b971718c5b4a2f871edb9241bc8aa233afa17f97673e268677abe2711aa49fc2" exitCode=0
Mar 07 07:40:43 crc kubenswrapper[4738]: I0307 07:40:43.351907 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg" event={"ID":"b380f515-f2f7-41e5-ab55-44338086a792","Type":"ContainerDied","Data":"b971718c5b4a2f871edb9241bc8aa233afa17f97673e268677abe2711aa49fc2"}
Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.649546 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"
Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.679561 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"]
Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.685153 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg"]
Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.810893 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b380f515-f2f7-41e5-ab55-44338086a792-ring-data-devices\") pod \"b380f515-f2f7-41e5-ab55-44338086a792\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") "
Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.811002 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b380f515-f2f7-41e5-ab55-44338086a792-etc-swift\") pod \"b380f515-f2f7-41e5-ab55-44338086a792\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") "
Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.811070 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b380f515-f2f7-41e5-ab55-44338086a792-swiftconf\") pod \"b380f515-f2f7-41e5-ab55-44338086a792\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") "
Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.811093 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName:
\"kubernetes.io/secret/b380f515-f2f7-41e5-ab55-44338086a792-dispersionconf\") pod \"b380f515-f2f7-41e5-ab55-44338086a792\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.811113 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ctsp\" (UniqueName: \"kubernetes.io/projected/b380f515-f2f7-41e5-ab55-44338086a792-kube-api-access-6ctsp\") pod \"b380f515-f2f7-41e5-ab55-44338086a792\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.811132 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b380f515-f2f7-41e5-ab55-44338086a792-scripts\") pod \"b380f515-f2f7-41e5-ab55-44338086a792\" (UID: \"b380f515-f2f7-41e5-ab55-44338086a792\") " Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.811661 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b380f515-f2f7-41e5-ab55-44338086a792-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b380f515-f2f7-41e5-ab55-44338086a792" (UID: "b380f515-f2f7-41e5-ab55-44338086a792"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.812188 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b380f515-f2f7-41e5-ab55-44338086a792-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b380f515-f2f7-41e5-ab55-44338086a792" (UID: "b380f515-f2f7-41e5-ab55-44338086a792"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.817298 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b380f515-f2f7-41e5-ab55-44338086a792-kube-api-access-6ctsp" (OuterVolumeSpecName: "kube-api-access-6ctsp") pod "b380f515-f2f7-41e5-ab55-44338086a792" (UID: "b380f515-f2f7-41e5-ab55-44338086a792"). InnerVolumeSpecName "kube-api-access-6ctsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.838858 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b380f515-f2f7-41e5-ab55-44338086a792-scripts" (OuterVolumeSpecName: "scripts") pod "b380f515-f2f7-41e5-ab55-44338086a792" (UID: "b380f515-f2f7-41e5-ab55-44338086a792"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.842357 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b380f515-f2f7-41e5-ab55-44338086a792-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b380f515-f2f7-41e5-ab55-44338086a792" (UID: "b380f515-f2f7-41e5-ab55-44338086a792"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.842907 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b380f515-f2f7-41e5-ab55-44338086a792-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b380f515-f2f7-41e5-ab55-44338086a792" (UID: "b380f515-f2f7-41e5-ab55-44338086a792"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.912399 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b380f515-f2f7-41e5-ab55-44338086a792-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.912445 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b380f515-f2f7-41e5-ab55-44338086a792-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.912459 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b380f515-f2f7-41e5-ab55-44338086a792-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.912470 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b380f515-f2f7-41e5-ab55-44338086a792-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.912483 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ctsp\" (UniqueName: \"kubernetes.io/projected/b380f515-f2f7-41e5-ab55-44338086a792-kube-api-access-6ctsp\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:44 crc kubenswrapper[4738]: I0307 07:40:44.912496 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b380f515-f2f7-41e5-ab55-44338086a792-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:45 crc kubenswrapper[4738]: I0307 07:40:45.377604 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eae8de010f40ff1e287306dee430bb57ea23d6633d7bf8641ce1b48693f56ef" Mar 07 07:40:45 crc kubenswrapper[4738]: I0307 07:40:45.377672 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z6xg" Mar 07 07:40:45 crc kubenswrapper[4738]: I0307 07:40:45.884346 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c"] Mar 07 07:40:45 crc kubenswrapper[4738]: E0307 07:40:45.885217 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b380f515-f2f7-41e5-ab55-44338086a792" containerName="swift-ring-rebalance" Mar 07 07:40:45 crc kubenswrapper[4738]: I0307 07:40:45.885237 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b380f515-f2f7-41e5-ab55-44338086a792" containerName="swift-ring-rebalance" Mar 07 07:40:45 crc kubenswrapper[4738]: I0307 07:40:45.885436 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="b380f515-f2f7-41e5-ab55-44338086a792" containerName="swift-ring-rebalance" Mar 07 07:40:45 crc kubenswrapper[4738]: I0307 07:40:45.886100 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:45 crc kubenswrapper[4738]: I0307 07:40:45.888703 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:40:45 crc kubenswrapper[4738]: I0307 07:40:45.889554 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:40:45 crc kubenswrapper[4738]: I0307 07:40:45.896307 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c"] Mar 07 07:40:45 crc kubenswrapper[4738]: I0307 07:40:45.927699 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-ring-data-devices\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:45 crc kubenswrapper[4738]: I0307 07:40:45.927740 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-swiftconf\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:45 crc kubenswrapper[4738]: I0307 07:40:45.927765 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cl4s\" (UniqueName: \"kubernetes.io/projected/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-kube-api-access-7cl4s\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:45 crc kubenswrapper[4738]: I0307 07:40:45.927787 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-scripts\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:45 crc kubenswrapper[4738]: I0307 07:40:45.927861 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-dispersionconf\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:45 crc kubenswrapper[4738]: I0307 07:40:45.927937 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-etc-swift\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:46 crc kubenswrapper[4738]: I0307 07:40:46.028911 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-etc-swift\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:46 crc kubenswrapper[4738]: I0307 07:40:46.029021 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-ring-data-devices\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:46 crc kubenswrapper[4738]: I0307 07:40:46.029040 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-swiftconf\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:46 crc kubenswrapper[4738]: I0307 07:40:46.029060 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cl4s\" (UniqueName: \"kubernetes.io/projected/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-kube-api-access-7cl4s\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:46 crc kubenswrapper[4738]: I0307 07:40:46.029083 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-scripts\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:46 crc kubenswrapper[4738]: I0307 07:40:46.029124 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-dispersionconf\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:46 crc kubenswrapper[4738]: I0307 07:40:46.029474 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-etc-swift\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:46 crc kubenswrapper[4738]: I0307 07:40:46.030042 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-scripts\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:46 crc kubenswrapper[4738]: I0307 07:40:46.030098 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-ring-data-devices\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:46 crc kubenswrapper[4738]: I0307 07:40:46.033699 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-swiftconf\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:46 crc kubenswrapper[4738]: I0307 07:40:46.040063 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-dispersionconf\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:46 crc kubenswrapper[4738]: I0307 07:40:46.049211 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cl4s\" (UniqueName: \"kubernetes.io/projected/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-kube-api-access-7cl4s\") pod \"swift-ring-rebalance-debug-sdr5c\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:46 crc kubenswrapper[4738]: I0307 07:40:46.210777 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:46 crc kubenswrapper[4738]: I0307 07:40:46.393651 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b380f515-f2f7-41e5-ab55-44338086a792" path="/var/lib/kubelet/pods/b380f515-f2f7-41e5-ab55-44338086a792/volumes" Mar 07 07:40:46 crc kubenswrapper[4738]: I0307 07:40:46.665339 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c"] Mar 07 07:40:47 crc kubenswrapper[4738]: I0307 07:40:47.426718 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" event={"ID":"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae","Type":"ContainerStarted","Data":"d4ebc34390da71d0c51f2f98a94549a896e291e241400dc2ad9027b629dfa3f7"} Mar 07 07:40:47 crc kubenswrapper[4738]: I0307 07:40:47.429113 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" event={"ID":"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae","Type":"ContainerStarted","Data":"ec44d2646fff8eea982f8e68805a24b46a713c7a3c14d65beabbceecaa7c7a1a"} Mar 07 07:40:47 crc kubenswrapper[4738]: I0307 07:40:47.638391 4738 scope.go:117] "RemoveContainer" containerID="3ef8089f860ad2bace4896f0738fb35a7660185b99d8865471c7248248b4272f" Mar 07 07:40:47 crc kubenswrapper[4738]: I0307 07:40:47.689981 4738 scope.go:117] "RemoveContainer" containerID="3cd362563b4d56f58b6239926eeb310ced33b46c2cb34634f1de4346cca22c87" Mar 07 07:40:47 crc kubenswrapper[4738]: I0307 07:40:47.735087 4738 scope.go:117] "RemoveContainer" containerID="5907f433b5c5166e09640ce5121a9f400f2088580f7f8e55d10611dc4060437c" Mar 07 07:40:47 crc kubenswrapper[4738]: I0307 07:40:47.773909 4738 scope.go:117] "RemoveContainer" containerID="40a15b6b1eb4dec92598e78e4f7c4576fffb3157b7eb7c0f673f6212da31e05a" Mar 07 07:40:47 crc kubenswrapper[4738]: I0307 07:40:47.809501 4738 scope.go:117] "RemoveContainer" 
containerID="52f757dc617b73618016cb8816cc15f3a0f3d38c50a37f78f741d5ef77340d30" Mar 07 07:40:47 crc kubenswrapper[4738]: I0307 07:40:47.851927 4738 scope.go:117] "RemoveContainer" containerID="d9d1ee8b76764a0fe674c7734b21dd1b4edda111ff8a4376b8a3a7e7b306938e" Mar 07 07:40:47 crc kubenswrapper[4738]: I0307 07:40:47.880525 4738 scope.go:117] "RemoveContainer" containerID="693a9c223b7503ece2072a18d5844b4534777b36159c195ebf6f3273a9d6d87d" Mar 07 07:40:47 crc kubenswrapper[4738]: I0307 07:40:47.913381 4738 scope.go:117] "RemoveContainer" containerID="a0036f66f349d52b02c46626e418de735262423574bd6ba21835f94fa0e65971" Mar 07 07:40:47 crc kubenswrapper[4738]: I0307 07:40:47.948650 4738 scope.go:117] "RemoveContainer" containerID="5779b2d284d0aa6027d3dc6627f5da49255dc59b40064b64a27cea8d8cd31a35" Mar 07 07:40:47 crc kubenswrapper[4738]: I0307 07:40:47.974408 4738 scope.go:117] "RemoveContainer" containerID="6f10ac55ca698b7413396ef4c7e04d5938349370aa55ae50ce9f88154b926dcb" Mar 07 07:40:48 crc kubenswrapper[4738]: I0307 07:40:48.008566 4738 scope.go:117] "RemoveContainer" containerID="17905e1fc64d702c3f89ac471a11ca4e9424f2523386ca65ba23f2ddff9e1add" Mar 07 07:40:48 crc kubenswrapper[4738]: I0307 07:40:48.436650 4738 generic.go:334] "Generic (PLEG): container finished" podID="0e36b7a3-4bb4-4a1b-8868-9554fa71bcae" containerID="d4ebc34390da71d0c51f2f98a94549a896e291e241400dc2ad9027b629dfa3f7" exitCode=0 Mar 07 07:40:48 crc kubenswrapper[4738]: I0307 07:40:48.436697 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" event={"ID":"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae","Type":"ContainerDied","Data":"d4ebc34390da71d0c51f2f98a94549a896e291e241400dc2ad9027b629dfa3f7"} Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.765853 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.807884 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c"] Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.814084 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c"] Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.892389 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cl4s\" (UniqueName: \"kubernetes.io/projected/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-kube-api-access-7cl4s\") pod \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.892500 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-swiftconf\") pod \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.892561 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-scripts\") pod \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.892655 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-ring-data-devices\") pod \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.892702 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-dispersionconf\") pod \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.892795 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-etc-swift\") pod \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\" (UID: \"0e36b7a3-4bb4-4a1b-8868-9554fa71bcae\") " Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.893534 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "0e36b7a3-4bb4-4a1b-8868-9554fa71bcae" (UID: "0e36b7a3-4bb4-4a1b-8868-9554fa71bcae"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.894325 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "0e36b7a3-4bb4-4a1b-8868-9554fa71bcae" (UID: "0e36b7a3-4bb4-4a1b-8868-9554fa71bcae"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.906448 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-kube-api-access-7cl4s" (OuterVolumeSpecName: "kube-api-access-7cl4s") pod "0e36b7a3-4bb4-4a1b-8868-9554fa71bcae" (UID: "0e36b7a3-4bb4-4a1b-8868-9554fa71bcae"). InnerVolumeSpecName "kube-api-access-7cl4s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.916358 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "0e36b7a3-4bb4-4a1b-8868-9554fa71bcae" (UID: "0e36b7a3-4bb4-4a1b-8868-9554fa71bcae"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.918338 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "0e36b7a3-4bb4-4a1b-8868-9554fa71bcae" (UID: "0e36b7a3-4bb4-4a1b-8868-9554fa71bcae"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.930453 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-scripts" (OuterVolumeSpecName: "scripts") pod "0e36b7a3-4bb4-4a1b-8868-9554fa71bcae" (UID: "0e36b7a3-4bb4-4a1b-8868-9554fa71bcae"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.994629 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cl4s\" (UniqueName: \"kubernetes.io/projected/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-kube-api-access-7cl4s\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.994680 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.994700 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.994717 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.994733 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:49 crc kubenswrapper[4738]: I0307 07:40:49.994751 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:50 crc kubenswrapper[4738]: I0307 07:40:50.395432 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e36b7a3-4bb4-4a1b-8868-9554fa71bcae" path="/var/lib/kubelet/pods/0e36b7a3-4bb4-4a1b-8868-9554fa71bcae/volumes" Mar 07 07:40:50 crc kubenswrapper[4738]: I0307 07:40:50.461577 4738 scope.go:117] "RemoveContainer" 
containerID="d4ebc34390da71d0c51f2f98a94549a896e291e241400dc2ad9027b629dfa3f7" Mar 07 07:40:50 crc kubenswrapper[4738]: I0307 07:40:50.461652 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-sdr5c" Mar 07 07:40:50 crc kubenswrapper[4738]: I0307 07:40:50.996356 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv"] Mar 07 07:40:50 crc kubenswrapper[4738]: E0307 07:40:50.996926 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e36b7a3-4bb4-4a1b-8868-9554fa71bcae" containerName="swift-ring-rebalance" Mar 07 07:40:50 crc kubenswrapper[4738]: I0307 07:40:50.996950 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e36b7a3-4bb4-4a1b-8868-9554fa71bcae" containerName="swift-ring-rebalance" Mar 07 07:40:50 crc kubenswrapper[4738]: I0307 07:40:50.997242 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e36b7a3-4bb4-4a1b-8868-9554fa71bcae" containerName="swift-ring-rebalance" Mar 07 07:40:50 crc kubenswrapper[4738]: I0307 07:40:50.998104 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.000078 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.001368 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.008658 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv"] Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.009815 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e7a47848-ab71-415f-8395-b33d6ce425be-etc-swift\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.021295 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e7a47848-ab71-415f-8395-b33d6ce425be-ring-data-devices\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.021488 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e7a47848-ab71-415f-8395-b33d6ce425be-swiftconf\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.021526 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7a47848-ab71-415f-8395-b33d6ce425be-scripts\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.021564 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e7a47848-ab71-415f-8395-b33d6ce425be-dispersionconf\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.021727 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hz8qd\" (UniqueName: \"kubernetes.io/projected/e7a47848-ab71-415f-8395-b33d6ce425be-kube-api-access-hz8qd\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.123603 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7a47848-ab71-415f-8395-b33d6ce425be-scripts\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.123694 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e7a47848-ab71-415f-8395-b33d6ce425be-dispersionconf\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc 
kubenswrapper[4738]: I0307 07:40:51.123842 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hz8qd\" (UniqueName: \"kubernetes.io/projected/e7a47848-ab71-415f-8395-b33d6ce425be-kube-api-access-hz8qd\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.123893 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e7a47848-ab71-415f-8395-b33d6ce425be-etc-swift\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.123960 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e7a47848-ab71-415f-8395-b33d6ce425be-ring-data-devices\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.124082 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e7a47848-ab71-415f-8395-b33d6ce425be-swiftconf\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.124544 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7a47848-ab71-415f-8395-b33d6ce425be-scripts\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc 
kubenswrapper[4738]: I0307 07:40:51.124924 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e7a47848-ab71-415f-8395-b33d6ce425be-etc-swift\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.125644 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e7a47848-ab71-415f-8395-b33d6ce425be-ring-data-devices\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.130169 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e7a47848-ab71-415f-8395-b33d6ce425be-dispersionconf\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.130816 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e7a47848-ab71-415f-8395-b33d6ce425be-swiftconf\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.143700 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hz8qd\" (UniqueName: \"kubernetes.io/projected/e7a47848-ab71-415f-8395-b33d6ce425be-kube-api-access-hz8qd\") pod \"swift-ring-rebalance-debug-nsgpv\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: 
I0307 07:40:51.327307 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:51 crc kubenswrapper[4738]: I0307 07:40:51.749568 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv"] Mar 07 07:40:52 crc kubenswrapper[4738]: I0307 07:40:52.484509 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" event={"ID":"e7a47848-ab71-415f-8395-b33d6ce425be","Type":"ContainerStarted","Data":"b13bafba945e0d6abf55d42d8391c8fcbf9cd0a204c967dc3f1d4313a0f56b1f"} Mar 07 07:40:52 crc kubenswrapper[4738]: I0307 07:40:52.484851 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" event={"ID":"e7a47848-ab71-415f-8395-b33d6ce425be","Type":"ContainerStarted","Data":"4a94bd12ba9de6ad323a42c07ce1308c7c6fd73b751505d2d1ddc184deff1d6d"} Mar 07 07:40:52 crc kubenswrapper[4738]: I0307 07:40:52.507889 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" podStartSLOduration=2.507869434 podStartE2EDuration="2.507869434s" podCreationTimestamp="2026-03-07 07:40:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:40:52.506990161 +0000 UTC m=+2470.971977502" watchObservedRunningTime="2026-03-07 07:40:52.507869434 +0000 UTC m=+2470.972856755" Mar 07 07:40:53 crc kubenswrapper[4738]: I0307 07:40:53.499236 4738 generic.go:334] "Generic (PLEG): container finished" podID="e7a47848-ab71-415f-8395-b33d6ce425be" containerID="b13bafba945e0d6abf55d42d8391c8fcbf9cd0a204c967dc3f1d4313a0f56b1f" exitCode=0 Mar 07 07:40:53 crc kubenswrapper[4738]: I0307 07:40:53.499309 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" 
event={"ID":"e7a47848-ab71-415f-8395-b33d6ce425be","Type":"ContainerDied","Data":"b13bafba945e0d6abf55d42d8391c8fcbf9cd0a204c967dc3f1d4313a0f56b1f"} Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.782957 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.818667 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv"] Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.827466 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv"] Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.884885 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7a47848-ab71-415f-8395-b33d6ce425be-scripts\") pod \"e7a47848-ab71-415f-8395-b33d6ce425be\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.884937 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e7a47848-ab71-415f-8395-b33d6ce425be-dispersionconf\") pod \"e7a47848-ab71-415f-8395-b33d6ce425be\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.885014 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hz8qd\" (UniqueName: \"kubernetes.io/projected/e7a47848-ab71-415f-8395-b33d6ce425be-kube-api-access-hz8qd\") pod \"e7a47848-ab71-415f-8395-b33d6ce425be\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.885045 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/e7a47848-ab71-415f-8395-b33d6ce425be-etc-swift\") pod \"e7a47848-ab71-415f-8395-b33d6ce425be\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.885182 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e7a47848-ab71-415f-8395-b33d6ce425be-ring-data-devices\") pod \"e7a47848-ab71-415f-8395-b33d6ce425be\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.885256 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e7a47848-ab71-415f-8395-b33d6ce425be-swiftconf\") pod \"e7a47848-ab71-415f-8395-b33d6ce425be\" (UID: \"e7a47848-ab71-415f-8395-b33d6ce425be\") " Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.885976 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7a47848-ab71-415f-8395-b33d6ce425be-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e7a47848-ab71-415f-8395-b33d6ce425be" (UID: "e7a47848-ab71-415f-8395-b33d6ce425be"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.886664 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e7a47848-ab71-415f-8395-b33d6ce425be-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e7a47848-ab71-415f-8395-b33d6ce425be" (UID: "e7a47848-ab71-415f-8395-b33d6ce425be"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.898336 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7a47848-ab71-415f-8395-b33d6ce425be-kube-api-access-hz8qd" (OuterVolumeSpecName: "kube-api-access-hz8qd") pod "e7a47848-ab71-415f-8395-b33d6ce425be" (UID: "e7a47848-ab71-415f-8395-b33d6ce425be"). InnerVolumeSpecName "kube-api-access-hz8qd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.912662 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a47848-ab71-415f-8395-b33d6ce425be-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e7a47848-ab71-415f-8395-b33d6ce425be" (UID: "e7a47848-ab71-415f-8395-b33d6ce425be"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.913295 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7a47848-ab71-415f-8395-b33d6ce425be-scripts" (OuterVolumeSpecName: "scripts") pod "e7a47848-ab71-415f-8395-b33d6ce425be" (UID: "e7a47848-ab71-415f-8395-b33d6ce425be"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.916201 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7a47848-ab71-415f-8395-b33d6ce425be-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e7a47848-ab71-415f-8395-b33d6ce425be" (UID: "e7a47848-ab71-415f-8395-b33d6ce425be"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.987241 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e7a47848-ab71-415f-8395-b33d6ce425be-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.987279 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e7a47848-ab71-415f-8395-b33d6ce425be-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.987292 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7a47848-ab71-415f-8395-b33d6ce425be-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.987301 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e7a47848-ab71-415f-8395-b33d6ce425be-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.987310 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hz8qd\" (UniqueName: \"kubernetes.io/projected/e7a47848-ab71-415f-8395-b33d6ce425be-kube-api-access-hz8qd\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:54 crc kubenswrapper[4738]: I0307 07:40:54.987320 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e7a47848-ab71-415f-8395-b33d6ce425be-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:40:55 crc kubenswrapper[4738]: I0307 07:40:55.525007 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a94bd12ba9de6ad323a42c07ce1308c7c6fd73b751505d2d1ddc184deff1d6d" Mar 07 07:40:55 crc kubenswrapper[4738]: I0307 07:40:55.525117 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-nsgpv" Mar 07 07:40:55 crc kubenswrapper[4738]: I0307 07:40:55.956860 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q"] Mar 07 07:40:55 crc kubenswrapper[4738]: E0307 07:40:55.957468 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7a47848-ab71-415f-8395-b33d6ce425be" containerName="swift-ring-rebalance" Mar 07 07:40:55 crc kubenswrapper[4738]: I0307 07:40:55.957484 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7a47848-ab71-415f-8395-b33d6ce425be" containerName="swift-ring-rebalance" Mar 07 07:40:55 crc kubenswrapper[4738]: I0307 07:40:55.957675 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7a47848-ab71-415f-8395-b33d6ce425be" containerName="swift-ring-rebalance" Mar 07 07:40:55 crc kubenswrapper[4738]: I0307 07:40:55.958177 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:55.963306 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:55.963577 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:55.969921 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q"] Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.001478 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f5a968d-8af1-4fe6-adce-962c935f08a9-scripts\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.001521 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1f5a968d-8af1-4fe6-adce-962c935f08a9-swiftconf\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.001557 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1f5a968d-8af1-4fe6-adce-962c935f08a9-dispersionconf\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.001585 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1f5a968d-8af1-4fe6-adce-962c935f08a9-ring-data-devices\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.001616 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1f5a968d-8af1-4fe6-adce-962c935f08a9-etc-swift\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.001631 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnkcn\" (UniqueName: 
\"kubernetes.io/projected/1f5a968d-8af1-4fe6-adce-962c935f08a9-kube-api-access-dnkcn\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.103183 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1f5a968d-8af1-4fe6-adce-962c935f08a9-swiftconf\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.103278 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1f5a968d-8af1-4fe6-adce-962c935f08a9-dispersionconf\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.103330 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1f5a968d-8af1-4fe6-adce-962c935f08a9-ring-data-devices\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.103379 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1f5a968d-8af1-4fe6-adce-962c935f08a9-etc-swift\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.103403 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dnkcn\" (UniqueName: \"kubernetes.io/projected/1f5a968d-8af1-4fe6-adce-962c935f08a9-kube-api-access-dnkcn\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.103516 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f5a968d-8af1-4fe6-adce-962c935f08a9-scripts\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.103953 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1f5a968d-8af1-4fe6-adce-962c935f08a9-etc-swift\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.104379 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1f5a968d-8af1-4fe6-adce-962c935f08a9-ring-data-devices\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.104523 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f5a968d-8af1-4fe6-adce-962c935f08a9-scripts\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.106635 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/1f5a968d-8af1-4fe6-adce-962c935f08a9-swiftconf\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.116728 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1f5a968d-8af1-4fe6-adce-962c935f08a9-dispersionconf\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.132990 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnkcn\" (UniqueName: \"kubernetes.io/projected/1f5a968d-8af1-4fe6-adce-962c935f08a9-kube-api-access-dnkcn\") pod \"swift-ring-rebalance-debug-vrt9q\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.326694 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.400366 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7a47848-ab71-415f-8395-b33d6ce425be" path="/var/lib/kubelet/pods/e7a47848-ab71-415f-8395-b33d6ce425be/volumes" Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.571896 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q"] Mar 07 07:40:56 crc kubenswrapper[4738]: W0307 07:40:56.581560 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f5a968d_8af1_4fe6_adce_962c935f08a9.slice/crio-0d21ecff0c83b59b34d0be0ae7f09acd12ccea4e4c058c1c25c67050fee3a094 WatchSource:0}: Error finding container 0d21ecff0c83b59b34d0be0ae7f09acd12ccea4e4c058c1c25c67050fee3a094: Status 404 returned error can't find the container with id 0d21ecff0c83b59b34d0be0ae7f09acd12ccea4e4c058c1c25c67050fee3a094 Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.958526 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:40:56 crc kubenswrapper[4738]: I0307 07:40:56.958605 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:40:57 crc kubenswrapper[4738]: I0307 07:40:57.568286 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" 
event={"ID":"1f5a968d-8af1-4fe6-adce-962c935f08a9","Type":"ContainerStarted","Data":"d655d7a2311bc4e0b0a93145b26b61e885cd5c9b775558923f6131c9a848ee55"} Mar 07 07:40:57 crc kubenswrapper[4738]: I0307 07:40:57.568610 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" event={"ID":"1f5a968d-8af1-4fe6-adce-962c935f08a9","Type":"ContainerStarted","Data":"0d21ecff0c83b59b34d0be0ae7f09acd12ccea4e4c058c1c25c67050fee3a094"} Mar 07 07:40:57 crc kubenswrapper[4738]: I0307 07:40:57.599625 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" podStartSLOduration=2.599599351 podStartE2EDuration="2.599599351s" podCreationTimestamp="2026-03-07 07:40:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:40:57.593019275 +0000 UTC m=+2476.058006626" watchObservedRunningTime="2026-03-07 07:40:57.599599351 +0000 UTC m=+2476.064586702" Mar 07 07:40:58 crc kubenswrapper[4738]: I0307 07:40:58.581308 4738 generic.go:334] "Generic (PLEG): container finished" podID="1f5a968d-8af1-4fe6-adce-962c935f08a9" containerID="d655d7a2311bc4e0b0a93145b26b61e885cd5c9b775558923f6131c9a848ee55" exitCode=0 Mar 07 07:40:58 crc kubenswrapper[4738]: I0307 07:40:58.581364 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" event={"ID":"1f5a968d-8af1-4fe6-adce-962c935f08a9","Type":"ContainerDied","Data":"d655d7a2311bc4e0b0a93145b26b61e885cd5c9b775558923f6131c9a848ee55"} Mar 07 07:40:59 crc kubenswrapper[4738]: I0307 07:40:59.865520 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:40:59 crc kubenswrapper[4738]: I0307 07:40:59.910702 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q"] Mar 07 07:40:59 crc kubenswrapper[4738]: I0307 07:40:59.923627 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q"] Mar 07 07:40:59 crc kubenswrapper[4738]: I0307 07:40:59.976096 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f5a968d-8af1-4fe6-adce-962c935f08a9-scripts\") pod \"1f5a968d-8af1-4fe6-adce-962c935f08a9\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " Mar 07 07:40:59 crc kubenswrapper[4738]: I0307 07:40:59.976202 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnkcn\" (UniqueName: \"kubernetes.io/projected/1f5a968d-8af1-4fe6-adce-962c935f08a9-kube-api-access-dnkcn\") pod \"1f5a968d-8af1-4fe6-adce-962c935f08a9\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " Mar 07 07:40:59 crc kubenswrapper[4738]: I0307 07:40:59.976252 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1f5a968d-8af1-4fe6-adce-962c935f08a9-dispersionconf\") pod \"1f5a968d-8af1-4fe6-adce-962c935f08a9\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " Mar 07 07:40:59 crc kubenswrapper[4738]: I0307 07:40:59.976279 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1f5a968d-8af1-4fe6-adce-962c935f08a9-ring-data-devices\") pod \"1f5a968d-8af1-4fe6-adce-962c935f08a9\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " Mar 07 07:40:59 crc kubenswrapper[4738]: I0307 07:40:59.976306 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1f5a968d-8af1-4fe6-adce-962c935f08a9-etc-swift\") pod \"1f5a968d-8af1-4fe6-adce-962c935f08a9\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " Mar 07 07:40:59 crc kubenswrapper[4738]: I0307 07:40:59.976405 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1f5a968d-8af1-4fe6-adce-962c935f08a9-swiftconf\") pod \"1f5a968d-8af1-4fe6-adce-962c935f08a9\" (UID: \"1f5a968d-8af1-4fe6-adce-962c935f08a9\") " Mar 07 07:40:59 crc kubenswrapper[4738]: I0307 07:40:59.977343 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5a968d-8af1-4fe6-adce-962c935f08a9-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1f5a968d-8af1-4fe6-adce-962c935f08a9" (UID: "1f5a968d-8af1-4fe6-adce-962c935f08a9"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:40:59 crc kubenswrapper[4738]: I0307 07:40:59.977483 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f5a968d-8af1-4fe6-adce-962c935f08a9-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1f5a968d-8af1-4fe6-adce-962c935f08a9" (UID: "1f5a968d-8af1-4fe6-adce-962c935f08a9"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:40:59 crc kubenswrapper[4738]: I0307 07:40:59.982032 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f5a968d-8af1-4fe6-adce-962c935f08a9-kube-api-access-dnkcn" (OuterVolumeSpecName: "kube-api-access-dnkcn") pod "1f5a968d-8af1-4fe6-adce-962c935f08a9" (UID: "1f5a968d-8af1-4fe6-adce-962c935f08a9"). InnerVolumeSpecName "kube-api-access-dnkcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:40:59 crc kubenswrapper[4738]: I0307 07:40:59.997337 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f5a968d-8af1-4fe6-adce-962c935f08a9-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1f5a968d-8af1-4fe6-adce-962c935f08a9" (UID: "1f5a968d-8af1-4fe6-adce-962c935f08a9"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:00 crc kubenswrapper[4738]: I0307 07:41:00.000268 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f5a968d-8af1-4fe6-adce-962c935f08a9-scripts" (OuterVolumeSpecName: "scripts") pod "1f5a968d-8af1-4fe6-adce-962c935f08a9" (UID: "1f5a968d-8af1-4fe6-adce-962c935f08a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:00 crc kubenswrapper[4738]: I0307 07:41:00.015113 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f5a968d-8af1-4fe6-adce-962c935f08a9-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1f5a968d-8af1-4fe6-adce-962c935f08a9" (UID: "1f5a968d-8af1-4fe6-adce-962c935f08a9"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:00 crc kubenswrapper[4738]: I0307 07:41:00.078126 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1f5a968d-8af1-4fe6-adce-962c935f08a9-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:00 crc kubenswrapper[4738]: I0307 07:41:00.078182 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1f5a968d-8af1-4fe6-adce-962c935f08a9-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:00 crc kubenswrapper[4738]: I0307 07:41:00.078195 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dnkcn\" (UniqueName: \"kubernetes.io/projected/1f5a968d-8af1-4fe6-adce-962c935f08a9-kube-api-access-dnkcn\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:00 crc kubenswrapper[4738]: I0307 07:41:00.078210 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1f5a968d-8af1-4fe6-adce-962c935f08a9-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:00 crc kubenswrapper[4738]: I0307 07:41:00.078220 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1f5a968d-8af1-4fe6-adce-962c935f08a9-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:00 crc kubenswrapper[4738]: I0307 07:41:00.078231 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1f5a968d-8af1-4fe6-adce-962c935f08a9-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:00 crc kubenswrapper[4738]: I0307 07:41:00.398736 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f5a968d-8af1-4fe6-adce-962c935f08a9" path="/var/lib/kubelet/pods/1f5a968d-8af1-4fe6-adce-962c935f08a9/volumes" Mar 07 07:41:00 crc kubenswrapper[4738]: I0307 07:41:00.602943 4738 scope.go:117] "RemoveContainer" 
containerID="d655d7a2311bc4e0b0a93145b26b61e885cd5c9b775558923f6131c9a848ee55" Mar 07 07:41:00 crc kubenswrapper[4738]: I0307 07:41:00.602971 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-vrt9q" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.075388 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr"] Mar 07 07:41:01 crc kubenswrapper[4738]: E0307 07:41:01.077239 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f5a968d-8af1-4fe6-adce-962c935f08a9" containerName="swift-ring-rebalance" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.077399 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f5a968d-8af1-4fe6-adce-962c935f08a9" containerName="swift-ring-rebalance" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.077803 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f5a968d-8af1-4fe6-adce-962c935f08a9" containerName="swift-ring-rebalance" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.078733 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.081453 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.081587 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.083815 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr"] Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.194995 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/df2f56b1-3af2-43aa-9118-4ea47186d7c6-dispersionconf\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.195490 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df2f56b1-3af2-43aa-9118-4ea47186d7c6-scripts\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.195680 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr9ns\" (UniqueName: \"kubernetes.io/projected/df2f56b1-3af2-43aa-9118-4ea47186d7c6-kube-api-access-dr9ns\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.195923 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/df2f56b1-3af2-43aa-9118-4ea47186d7c6-ring-data-devices\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.196135 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/df2f56b1-3af2-43aa-9118-4ea47186d7c6-etc-swift\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.196372 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/df2f56b1-3af2-43aa-9118-4ea47186d7c6-swiftconf\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.297741 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/df2f56b1-3af2-43aa-9118-4ea47186d7c6-ring-data-devices\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.298284 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/df2f56b1-3af2-43aa-9118-4ea47186d7c6-etc-swift\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc 
kubenswrapper[4738]: I0307 07:41:01.298481 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/df2f56b1-3af2-43aa-9118-4ea47186d7c6-swiftconf\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.298691 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/df2f56b1-3af2-43aa-9118-4ea47186d7c6-ring-data-devices\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.298691 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/df2f56b1-3af2-43aa-9118-4ea47186d7c6-etc-swift\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.299006 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/df2f56b1-3af2-43aa-9118-4ea47186d7c6-dispersionconf\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.299134 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df2f56b1-3af2-43aa-9118-4ea47186d7c6-scripts\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: 
I0307 07:41:01.299206 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr9ns\" (UniqueName: \"kubernetes.io/projected/df2f56b1-3af2-43aa-9118-4ea47186d7c6-kube-api-access-dr9ns\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.300250 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df2f56b1-3af2-43aa-9118-4ea47186d7c6-scripts\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.315859 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/df2f56b1-3af2-43aa-9118-4ea47186d7c6-dispersionconf\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.315924 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/df2f56b1-3af2-43aa-9118-4ea47186d7c6-swiftconf\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.322137 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr9ns\" (UniqueName: \"kubernetes.io/projected/df2f56b1-3af2-43aa-9118-4ea47186d7c6-kube-api-access-dr9ns\") pod \"swift-ring-rebalance-debug-f6fgr\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 
07:41:01.407538 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:01 crc kubenswrapper[4738]: I0307 07:41:01.841626 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr"] Mar 07 07:41:02 crc kubenswrapper[4738]: I0307 07:41:02.636213 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" event={"ID":"df2f56b1-3af2-43aa-9118-4ea47186d7c6","Type":"ContainerStarted","Data":"5b4d45e959be0a7c1b5187901c7243a2d5e1c9b82bcb5b528b971370af455bae"} Mar 07 07:41:02 crc kubenswrapper[4738]: I0307 07:41:02.636525 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" event={"ID":"df2f56b1-3af2-43aa-9118-4ea47186d7c6","Type":"ContainerStarted","Data":"db032c07f1d132a6a2c3ee851a738b721824dc0a400a3400ae7952dee5434e33"} Mar 07 07:41:02 crc kubenswrapper[4738]: I0307 07:41:02.660927 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" podStartSLOduration=1.6609052819999999 podStartE2EDuration="1.660905282s" podCreationTimestamp="2026-03-07 07:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:41:02.655013104 +0000 UTC m=+2481.120000435" watchObservedRunningTime="2026-03-07 07:41:02.660905282 +0000 UTC m=+2481.125892603" Mar 07 07:41:03 crc kubenswrapper[4738]: I0307 07:41:03.645880 4738 generic.go:334] "Generic (PLEG): container finished" podID="df2f56b1-3af2-43aa-9118-4ea47186d7c6" containerID="5b4d45e959be0a7c1b5187901c7243a2d5e1c9b82bcb5b528b971370af455bae" exitCode=0 Mar 07 07:41:03 crc kubenswrapper[4738]: I0307 07:41:03.645970 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" 
event={"ID":"df2f56b1-3af2-43aa-9118-4ea47186d7c6","Type":"ContainerDied","Data":"5b4d45e959be0a7c1b5187901c7243a2d5e1c9b82bcb5b528b971370af455bae"} Mar 07 07:41:04 crc kubenswrapper[4738]: I0307 07:41:04.969907 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.001038 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr"] Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.006879 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr"] Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.169880 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dr9ns\" (UniqueName: \"kubernetes.io/projected/df2f56b1-3af2-43aa-9118-4ea47186d7c6-kube-api-access-dr9ns\") pod \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.169960 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df2f56b1-3af2-43aa-9118-4ea47186d7c6-scripts\") pod \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.170056 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/df2f56b1-3af2-43aa-9118-4ea47186d7c6-swiftconf\") pod \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.170105 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/df2f56b1-3af2-43aa-9118-4ea47186d7c6-etc-swift\") pod \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.170128 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/df2f56b1-3af2-43aa-9118-4ea47186d7c6-ring-data-devices\") pod \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.170186 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/df2f56b1-3af2-43aa-9118-4ea47186d7c6-dispersionconf\") pod \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\" (UID: \"df2f56b1-3af2-43aa-9118-4ea47186d7c6\") " Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.171083 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df2f56b1-3af2-43aa-9118-4ea47186d7c6-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "df2f56b1-3af2-43aa-9118-4ea47186d7c6" (UID: "df2f56b1-3af2-43aa-9118-4ea47186d7c6"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.171374 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df2f56b1-3af2-43aa-9118-4ea47186d7c6-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "df2f56b1-3af2-43aa-9118-4ea47186d7c6" (UID: "df2f56b1-3af2-43aa-9118-4ea47186d7c6"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.174723 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df2f56b1-3af2-43aa-9118-4ea47186d7c6-kube-api-access-dr9ns" (OuterVolumeSpecName: "kube-api-access-dr9ns") pod "df2f56b1-3af2-43aa-9118-4ea47186d7c6" (UID: "df2f56b1-3af2-43aa-9118-4ea47186d7c6"). InnerVolumeSpecName "kube-api-access-dr9ns". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.196806 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df2f56b1-3af2-43aa-9118-4ea47186d7c6-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "df2f56b1-3af2-43aa-9118-4ea47186d7c6" (UID: "df2f56b1-3af2-43aa-9118-4ea47186d7c6"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.205540 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df2f56b1-3af2-43aa-9118-4ea47186d7c6-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "df2f56b1-3af2-43aa-9118-4ea47186d7c6" (UID: "df2f56b1-3af2-43aa-9118-4ea47186d7c6"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.217512 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df2f56b1-3af2-43aa-9118-4ea47186d7c6-scripts" (OuterVolumeSpecName: "scripts") pod "df2f56b1-3af2-43aa-9118-4ea47186d7c6" (UID: "df2f56b1-3af2-43aa-9118-4ea47186d7c6"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.271594 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/df2f56b1-3af2-43aa-9118-4ea47186d7c6-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.271630 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/df2f56b1-3af2-43aa-9118-4ea47186d7c6-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.271641 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/df2f56b1-3af2-43aa-9118-4ea47186d7c6-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.271651 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/df2f56b1-3af2-43aa-9118-4ea47186d7c6-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.271661 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/df2f56b1-3af2-43aa-9118-4ea47186d7c6-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.271670 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dr9ns\" (UniqueName: \"kubernetes.io/projected/df2f56b1-3af2-43aa-9118-4ea47186d7c6-kube-api-access-dr9ns\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.665476 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db032c07f1d132a6a2c3ee851a738b721824dc0a400a3400ae7952dee5434e33" Mar 07 07:41:05 crc kubenswrapper[4738]: I0307 07:41:05.665528 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-f6fgr" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.148635 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6"] Mar 07 07:41:06 crc kubenswrapper[4738]: E0307 07:41:06.148926 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df2f56b1-3af2-43aa-9118-4ea47186d7c6" containerName="swift-ring-rebalance" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.148938 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="df2f56b1-3af2-43aa-9118-4ea47186d7c6" containerName="swift-ring-rebalance" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.149099 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="df2f56b1-3af2-43aa-9118-4ea47186d7c6" containerName="swift-ring-rebalance" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.149618 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.151674 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.153224 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.172622 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6"] Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.287547 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/55d142bc-ba91-4f9e-a976-5c72872c8efe-ring-data-devices\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.287658 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n79tg\" (UniqueName: \"kubernetes.io/projected/55d142bc-ba91-4f9e-a976-5c72872c8efe-kube-api-access-n79tg\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.287707 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/55d142bc-ba91-4f9e-a976-5c72872c8efe-dispersionconf\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.287844 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55d142bc-ba91-4f9e-a976-5c72872c8efe-scripts\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.287883 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/55d142bc-ba91-4f9e-a976-5c72872c8efe-swiftconf\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.287916 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/55d142bc-ba91-4f9e-a976-5c72872c8efe-etc-swift\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.389496 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/55d142bc-ba91-4f9e-a976-5c72872c8efe-ring-data-devices\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.389601 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n79tg\" (UniqueName: \"kubernetes.io/projected/55d142bc-ba91-4f9e-a976-5c72872c8efe-kube-api-access-n79tg\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.389662 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/55d142bc-ba91-4f9e-a976-5c72872c8efe-dispersionconf\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.389866 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55d142bc-ba91-4f9e-a976-5c72872c8efe-scripts\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.389907 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/55d142bc-ba91-4f9e-a976-5c72872c8efe-swiftconf\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.389952 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/55d142bc-ba91-4f9e-a976-5c72872c8efe-etc-swift\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.390607 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/55d142bc-ba91-4f9e-a976-5c72872c8efe-etc-swift\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.390777 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/55d142bc-ba91-4f9e-a976-5c72872c8efe-ring-data-devices\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.392036 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55d142bc-ba91-4f9e-a976-5c72872c8efe-scripts\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.402920 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/55d142bc-ba91-4f9e-a976-5c72872c8efe-swiftconf\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.403265 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/55d142bc-ba91-4f9e-a976-5c72872c8efe-dispersionconf\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.412110 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df2f56b1-3af2-43aa-9118-4ea47186d7c6" path="/var/lib/kubelet/pods/df2f56b1-3af2-43aa-9118-4ea47186d7c6/volumes" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.427743 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n79tg\" (UniqueName: \"kubernetes.io/projected/55d142bc-ba91-4f9e-a976-5c72872c8efe-kube-api-access-n79tg\") pod \"swift-ring-rebalance-debug-fgmh6\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.471780 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:06 crc kubenswrapper[4738]: I0307 07:41:06.935206 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6"] Mar 07 07:41:06 crc kubenswrapper[4738]: W0307 07:41:06.944027 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55d142bc_ba91_4f9e_a976_5c72872c8efe.slice/crio-442efd81cdbf26c00a4fa898c567596f62443d12d1f9f586676c632d975100d6 WatchSource:0}: Error finding container 442efd81cdbf26c00a4fa898c567596f62443d12d1f9f586676c632d975100d6: Status 404 returned error can't find the container with id 442efd81cdbf26c00a4fa898c567596f62443d12d1f9f586676c632d975100d6 Mar 07 07:41:07 crc kubenswrapper[4738]: I0307 07:41:07.684191 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" event={"ID":"55d142bc-ba91-4f9e-a976-5c72872c8efe","Type":"ContainerStarted","Data":"05c6c33c5aaee078fd8bb83c9669c365ad1f0dbe219cbbe3fbeea24e4a36377c"} Mar 07 07:41:07 crc kubenswrapper[4738]: I0307 07:41:07.684470 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" event={"ID":"55d142bc-ba91-4f9e-a976-5c72872c8efe","Type":"ContainerStarted","Data":"442efd81cdbf26c00a4fa898c567596f62443d12d1f9f586676c632d975100d6"} Mar 07 07:41:07 crc kubenswrapper[4738]: I0307 07:41:07.713905 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" podStartSLOduration=1.71387293 podStartE2EDuration="1.71387293s" podCreationTimestamp="2026-03-07 07:41:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:41:07.705919677 +0000 UTC m=+2486.170907008" watchObservedRunningTime="2026-03-07 
07:41:07.71387293 +0000 UTC m=+2486.178860261" Mar 07 07:41:08 crc kubenswrapper[4738]: I0307 07:41:08.698426 4738 generic.go:334] "Generic (PLEG): container finished" podID="55d142bc-ba91-4f9e-a976-5c72872c8efe" containerID="05c6c33c5aaee078fd8bb83c9669c365ad1f0dbe219cbbe3fbeea24e4a36377c" exitCode=0 Mar 07 07:41:08 crc kubenswrapper[4738]: I0307 07:41:08.698481 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" event={"ID":"55d142bc-ba91-4f9e-a976-5c72872c8efe","Type":"ContainerDied","Data":"05c6c33c5aaee078fd8bb83c9669c365ad1f0dbe219cbbe3fbeea24e4a36377c"} Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.036537 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.070946 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6"] Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.077001 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6"] Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.150991 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/55d142bc-ba91-4f9e-a976-5c72872c8efe-ring-data-devices\") pod \"55d142bc-ba91-4f9e-a976-5c72872c8efe\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.151045 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n79tg\" (UniqueName: \"kubernetes.io/projected/55d142bc-ba91-4f9e-a976-5c72872c8efe-kube-api-access-n79tg\") pod \"55d142bc-ba91-4f9e-a976-5c72872c8efe\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.151083 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55d142bc-ba91-4f9e-a976-5c72872c8efe-scripts\") pod \"55d142bc-ba91-4f9e-a976-5c72872c8efe\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.151169 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/55d142bc-ba91-4f9e-a976-5c72872c8efe-dispersionconf\") pod \"55d142bc-ba91-4f9e-a976-5c72872c8efe\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.151279 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/55d142bc-ba91-4f9e-a976-5c72872c8efe-swiftconf\") pod \"55d142bc-ba91-4f9e-a976-5c72872c8efe\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.151305 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/55d142bc-ba91-4f9e-a976-5c72872c8efe-etc-swift\") pod \"55d142bc-ba91-4f9e-a976-5c72872c8efe\" (UID: \"55d142bc-ba91-4f9e-a976-5c72872c8efe\") " Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.151743 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55d142bc-ba91-4f9e-a976-5c72872c8efe-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "55d142bc-ba91-4f9e-a976-5c72872c8efe" (UID: "55d142bc-ba91-4f9e-a976-5c72872c8efe"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.152777 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55d142bc-ba91-4f9e-a976-5c72872c8efe-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "55d142bc-ba91-4f9e-a976-5c72872c8efe" (UID: "55d142bc-ba91-4f9e-a976-5c72872c8efe"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.157203 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55d142bc-ba91-4f9e-a976-5c72872c8efe-kube-api-access-n79tg" (OuterVolumeSpecName: "kube-api-access-n79tg") pod "55d142bc-ba91-4f9e-a976-5c72872c8efe" (UID: "55d142bc-ba91-4f9e-a976-5c72872c8efe"). InnerVolumeSpecName "kube-api-access-n79tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.173762 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d142bc-ba91-4f9e-a976-5c72872c8efe-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "55d142bc-ba91-4f9e-a976-5c72872c8efe" (UID: "55d142bc-ba91-4f9e-a976-5c72872c8efe"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.182611 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55d142bc-ba91-4f9e-a976-5c72872c8efe-scripts" (OuterVolumeSpecName: "scripts") pod "55d142bc-ba91-4f9e-a976-5c72872c8efe" (UID: "55d142bc-ba91-4f9e-a976-5c72872c8efe"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.182941 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55d142bc-ba91-4f9e-a976-5c72872c8efe-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "55d142bc-ba91-4f9e-a976-5c72872c8efe" (UID: "55d142bc-ba91-4f9e-a976-5c72872c8efe"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.253959 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/55d142bc-ba91-4f9e-a976-5c72872c8efe-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.254188 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/55d142bc-ba91-4f9e-a976-5c72872c8efe-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.254303 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/55d142bc-ba91-4f9e-a976-5c72872c8efe-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.254370 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n79tg\" (UniqueName: \"kubernetes.io/projected/55d142bc-ba91-4f9e-a976-5c72872c8efe-kube-api-access-n79tg\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.254432 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55d142bc-ba91-4f9e-a976-5c72872c8efe-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.254501 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: 
\"kubernetes.io/secret/55d142bc-ba91-4f9e-a976-5c72872c8efe-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.401113 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55d142bc-ba91-4f9e-a976-5c72872c8efe" path="/var/lib/kubelet/pods/55d142bc-ba91-4f9e-a976-5c72872c8efe/volumes" Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.723271 4738 scope.go:117] "RemoveContainer" containerID="05c6c33c5aaee078fd8bb83c9669c365ad1f0dbe219cbbe3fbeea24e4a36377c" Mar 07 07:41:10 crc kubenswrapper[4738]: I0307 07:41:10.723366 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-fgmh6" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.220887 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-95rwd"] Mar 07 07:41:11 crc kubenswrapper[4738]: E0307 07:41:11.221523 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55d142bc-ba91-4f9e-a976-5c72872c8efe" containerName="swift-ring-rebalance" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.221560 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="55d142bc-ba91-4f9e-a976-5c72872c8efe" containerName="swift-ring-rebalance" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.221957 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="55d142bc-ba91-4f9e-a976-5c72872c8efe" containerName="swift-ring-rebalance" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.223039 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.226052 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.226689 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.238195 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-95rwd"] Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.373195 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qmc7\" (UniqueName: \"kubernetes.io/projected/48442aba-0d49-4542-9e61-f27f7476fc3f-kube-api-access-6qmc7\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.373525 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48442aba-0d49-4542-9e61-f27f7476fc3f-scripts\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.373653 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48442aba-0d49-4542-9e61-f27f7476fc3f-swiftconf\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.373784 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48442aba-0d49-4542-9e61-f27f7476fc3f-ring-data-devices\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.373912 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48442aba-0d49-4542-9e61-f27f7476fc3f-etc-swift\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.374035 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48442aba-0d49-4542-9e61-f27f7476fc3f-dispersionconf\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.475572 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48442aba-0d49-4542-9e61-f27f7476fc3f-scripts\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.475673 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48442aba-0d49-4542-9e61-f27f7476fc3f-swiftconf\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: 
I0307 07:41:11.475714 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48442aba-0d49-4542-9e61-f27f7476fc3f-ring-data-devices\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.475741 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48442aba-0d49-4542-9e61-f27f7476fc3f-etc-swift\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.475771 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48442aba-0d49-4542-9e61-f27f7476fc3f-dispersionconf\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.475798 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qmc7\" (UniqueName: \"kubernetes.io/projected/48442aba-0d49-4542-9e61-f27f7476fc3f-kube-api-access-6qmc7\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.476408 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48442aba-0d49-4542-9e61-f27f7476fc3f-scripts\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc 
kubenswrapper[4738]: I0307 07:41:11.476647 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48442aba-0d49-4542-9e61-f27f7476fc3f-etc-swift\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.476661 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48442aba-0d49-4542-9e61-f27f7476fc3f-ring-data-devices\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.480899 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48442aba-0d49-4542-9e61-f27f7476fc3f-dispersionconf\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.481269 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48442aba-0d49-4542-9e61-f27f7476fc3f-swiftconf\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.500857 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qmc7\" (UniqueName: \"kubernetes.io/projected/48442aba-0d49-4542-9e61-f27f7476fc3f-kube-api-access-6qmc7\") pod \"swift-ring-rebalance-debug-95rwd\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: 
I0307 07:41:11.550488 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:11 crc kubenswrapper[4738]: I0307 07:41:11.974913 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-95rwd"] Mar 07 07:41:12 crc kubenswrapper[4738]: I0307 07:41:12.748753 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" event={"ID":"48442aba-0d49-4542-9e61-f27f7476fc3f","Type":"ContainerStarted","Data":"93997d9d5fc597655aaee18b2640ecc848ecc8938fc2417e886114a31e1f9b73"} Mar 07 07:41:12 crc kubenswrapper[4738]: I0307 07:41:12.749003 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" event={"ID":"48442aba-0d49-4542-9e61-f27f7476fc3f","Type":"ContainerStarted","Data":"2f8912328953df1567f84df8e2c807fd9d370e81a083bc47cad1904a841f7e36"} Mar 07 07:41:12 crc kubenswrapper[4738]: I0307 07:41:12.774106 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" podStartSLOduration=1.774080642 podStartE2EDuration="1.774080642s" podCreationTimestamp="2026-03-07 07:41:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:41:12.766769837 +0000 UTC m=+2491.231757188" watchObservedRunningTime="2026-03-07 07:41:12.774080642 +0000 UTC m=+2491.239067973" Mar 07 07:41:13 crc kubenswrapper[4738]: I0307 07:41:13.759210 4738 generic.go:334] "Generic (PLEG): container finished" podID="48442aba-0d49-4542-9e61-f27f7476fc3f" containerID="93997d9d5fc597655aaee18b2640ecc848ecc8938fc2417e886114a31e1f9b73" exitCode=0 Mar 07 07:41:13 crc kubenswrapper[4738]: I0307 07:41:13.759312 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" 
event={"ID":"48442aba-0d49-4542-9e61-f27f7476fc3f","Type":"ContainerDied","Data":"93997d9d5fc597655aaee18b2640ecc848ecc8938fc2417e886114a31e1f9b73"} Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.054513 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.106302 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-95rwd"] Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.115702 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-95rwd"] Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.233576 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48442aba-0d49-4542-9e61-f27f7476fc3f-swiftconf\") pod \"48442aba-0d49-4542-9e61-f27f7476fc3f\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.233695 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48442aba-0d49-4542-9e61-f27f7476fc3f-dispersionconf\") pod \"48442aba-0d49-4542-9e61-f27f7476fc3f\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.233755 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qmc7\" (UniqueName: \"kubernetes.io/projected/48442aba-0d49-4542-9e61-f27f7476fc3f-kube-api-access-6qmc7\") pod \"48442aba-0d49-4542-9e61-f27f7476fc3f\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.233825 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/48442aba-0d49-4542-9e61-f27f7476fc3f-etc-swift\") pod \"48442aba-0d49-4542-9e61-f27f7476fc3f\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.233877 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48442aba-0d49-4542-9e61-f27f7476fc3f-ring-data-devices\") pod \"48442aba-0d49-4542-9e61-f27f7476fc3f\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.233906 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48442aba-0d49-4542-9e61-f27f7476fc3f-scripts\") pod \"48442aba-0d49-4542-9e61-f27f7476fc3f\" (UID: \"48442aba-0d49-4542-9e61-f27f7476fc3f\") " Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.234595 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48442aba-0d49-4542-9e61-f27f7476fc3f-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "48442aba-0d49-4542-9e61-f27f7476fc3f" (UID: "48442aba-0d49-4542-9e61-f27f7476fc3f"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.234596 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48442aba-0d49-4542-9e61-f27f7476fc3f-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "48442aba-0d49-4542-9e61-f27f7476fc3f" (UID: "48442aba-0d49-4542-9e61-f27f7476fc3f"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.239568 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48442aba-0d49-4542-9e61-f27f7476fc3f-kube-api-access-6qmc7" (OuterVolumeSpecName: "kube-api-access-6qmc7") pod "48442aba-0d49-4542-9e61-f27f7476fc3f" (UID: "48442aba-0d49-4542-9e61-f27f7476fc3f"). InnerVolumeSpecName "kube-api-access-6qmc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.255229 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48442aba-0d49-4542-9e61-f27f7476fc3f-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "48442aba-0d49-4542-9e61-f27f7476fc3f" (UID: "48442aba-0d49-4542-9e61-f27f7476fc3f"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.256023 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48442aba-0d49-4542-9e61-f27f7476fc3f-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "48442aba-0d49-4542-9e61-f27f7476fc3f" (UID: "48442aba-0d49-4542-9e61-f27f7476fc3f"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.256566 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48442aba-0d49-4542-9e61-f27f7476fc3f-scripts" (OuterVolumeSpecName: "scripts") pod "48442aba-0d49-4542-9e61-f27f7476fc3f" (UID: "48442aba-0d49-4542-9e61-f27f7476fc3f"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.335859 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qmc7\" (UniqueName: \"kubernetes.io/projected/48442aba-0d49-4542-9e61-f27f7476fc3f-kube-api-access-6qmc7\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.335896 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/48442aba-0d49-4542-9e61-f27f7476fc3f-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.335920 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/48442aba-0d49-4542-9e61-f27f7476fc3f-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.335929 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48442aba-0d49-4542-9e61-f27f7476fc3f-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.335939 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/48442aba-0d49-4542-9e61-f27f7476fc3f-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.335946 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/48442aba-0d49-4542-9e61-f27f7476fc3f-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.781430 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f8912328953df1567f84df8e2c807fd9d370e81a083bc47cad1904a841f7e36" Mar 07 07:41:15 crc kubenswrapper[4738]: I0307 07:41:15.781492 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-95rwd" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.306478 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mdslg"] Mar 07 07:41:16 crc kubenswrapper[4738]: E0307 07:41:16.307361 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48442aba-0d49-4542-9e61-f27f7476fc3f" containerName="swift-ring-rebalance" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.307385 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="48442aba-0d49-4542-9e61-f27f7476fc3f" containerName="swift-ring-rebalance" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.307666 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="48442aba-0d49-4542-9e61-f27f7476fc3f" containerName="swift-ring-rebalance" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.308545 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.311599 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.311890 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.317620 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mdslg"] Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.396984 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48442aba-0d49-4542-9e61-f27f7476fc3f" path="/var/lib/kubelet/pods/48442aba-0d49-4542-9e61-f27f7476fc3f/volumes" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.452199 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d66752d-c908-41ae-918a-3ed911c9950c-ring-data-devices\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.452259 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d66752d-c908-41ae-918a-3ed911c9950c-dispersionconf\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.452306 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d66752d-c908-41ae-918a-3ed911c9950c-swiftconf\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.452342 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d66752d-c908-41ae-918a-3ed911c9950c-etc-swift\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.452409 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d66752d-c908-41ae-918a-3ed911c9950c-scripts\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.452442 4738 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsrgp\" (UniqueName: \"kubernetes.io/projected/7d66752d-c908-41ae-918a-3ed911c9950c-kube-api-access-gsrgp\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.553676 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d66752d-c908-41ae-918a-3ed911c9950c-scripts\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.553754 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsrgp\" (UniqueName: \"kubernetes.io/projected/7d66752d-c908-41ae-918a-3ed911c9950c-kube-api-access-gsrgp\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.553862 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d66752d-c908-41ae-918a-3ed911c9950c-ring-data-devices\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.553885 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d66752d-c908-41ae-918a-3ed911c9950c-dispersionconf\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 
07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.553957 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d66752d-c908-41ae-918a-3ed911c9950c-swiftconf\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.554017 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d66752d-c908-41ae-918a-3ed911c9950c-etc-swift\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.555230 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d66752d-c908-41ae-918a-3ed911c9950c-scripts\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.555420 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d66752d-c908-41ae-918a-3ed911c9950c-etc-swift\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.556225 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d66752d-c908-41ae-918a-3ed911c9950c-ring-data-devices\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 
07:41:16.560474 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d66752d-c908-41ae-918a-3ed911c9950c-dispersionconf\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.567409 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d66752d-c908-41ae-918a-3ed911c9950c-swiftconf\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.570327 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsrgp\" (UniqueName: \"kubernetes.io/projected/7d66752d-c908-41ae-918a-3ed911c9950c-kube-api-access-gsrgp\") pod \"swift-ring-rebalance-debug-mdslg\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:16 crc kubenswrapper[4738]: I0307 07:41:16.632794 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:17 crc kubenswrapper[4738]: I0307 07:41:17.099043 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mdslg"] Mar 07 07:41:17 crc kubenswrapper[4738]: I0307 07:41:17.818121 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" event={"ID":"7d66752d-c908-41ae-918a-3ed911c9950c","Type":"ContainerStarted","Data":"828b059b4cf57cc0664f28aa94eea6be2123bc33d6fedb24cc6a583813fd34d7"} Mar 07 07:41:17 crc kubenswrapper[4738]: I0307 07:41:17.818468 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" event={"ID":"7d66752d-c908-41ae-918a-3ed911c9950c","Type":"ContainerStarted","Data":"bf55c56c8ed3fd4e9210937ef88127eceac0069c370abfe88a03a806bf32b405"} Mar 07 07:41:17 crc kubenswrapper[4738]: I0307 07:41:17.854552 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" podStartSLOduration=1.8545275079999999 podStartE2EDuration="1.854527508s" podCreationTimestamp="2026-03-07 07:41:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:41:17.838841577 +0000 UTC m=+2496.303828938" watchObservedRunningTime="2026-03-07 07:41:17.854527508 +0000 UTC m=+2496.319514849" Mar 07 07:41:18 crc kubenswrapper[4738]: I0307 07:41:18.828946 4738 generic.go:334] "Generic (PLEG): container finished" podID="7d66752d-c908-41ae-918a-3ed911c9950c" containerID="828b059b4cf57cc0664f28aa94eea6be2123bc33d6fedb24cc6a583813fd34d7" exitCode=0 Mar 07 07:41:18 crc kubenswrapper[4738]: I0307 07:41:18.829148 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" 
event={"ID":"7d66752d-c908-41ae-918a-3ed911c9950c","Type":"ContainerDied","Data":"828b059b4cf57cc0664f28aa94eea6be2123bc33d6fedb24cc6a583813fd34d7"} Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.114643 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.145391 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mdslg"] Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.149136 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-mdslg"] Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.220043 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsrgp\" (UniqueName: \"kubernetes.io/projected/7d66752d-c908-41ae-918a-3ed911c9950c-kube-api-access-gsrgp\") pod \"7d66752d-c908-41ae-918a-3ed911c9950c\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.220236 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d66752d-c908-41ae-918a-3ed911c9950c-swiftconf\") pod \"7d66752d-c908-41ae-918a-3ed911c9950c\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.220287 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d66752d-c908-41ae-918a-3ed911c9950c-ring-data-devices\") pod \"7d66752d-c908-41ae-918a-3ed911c9950c\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.220333 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/7d66752d-c908-41ae-918a-3ed911c9950c-scripts\") pod \"7d66752d-c908-41ae-918a-3ed911c9950c\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.220357 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d66752d-c908-41ae-918a-3ed911c9950c-dispersionconf\") pod \"7d66752d-c908-41ae-918a-3ed911c9950c\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.220388 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d66752d-c908-41ae-918a-3ed911c9950c-etc-swift\") pod \"7d66752d-c908-41ae-918a-3ed911c9950c\" (UID: \"7d66752d-c908-41ae-918a-3ed911c9950c\") " Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.221099 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d66752d-c908-41ae-918a-3ed911c9950c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7d66752d-c908-41ae-918a-3ed911c9950c" (UID: "7d66752d-c908-41ae-918a-3ed911c9950c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.221689 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d66752d-c908-41ae-918a-3ed911c9950c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7d66752d-c908-41ae-918a-3ed911c9950c" (UID: "7d66752d-c908-41ae-918a-3ed911c9950c"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.230000 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d66752d-c908-41ae-918a-3ed911c9950c-kube-api-access-gsrgp" (OuterVolumeSpecName: "kube-api-access-gsrgp") pod "7d66752d-c908-41ae-918a-3ed911c9950c" (UID: "7d66752d-c908-41ae-918a-3ed911c9950c"). InnerVolumeSpecName "kube-api-access-gsrgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.240591 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d66752d-c908-41ae-918a-3ed911c9950c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7d66752d-c908-41ae-918a-3ed911c9950c" (UID: "7d66752d-c908-41ae-918a-3ed911c9950c"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.241067 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d66752d-c908-41ae-918a-3ed911c9950c-scripts" (OuterVolumeSpecName: "scripts") pod "7d66752d-c908-41ae-918a-3ed911c9950c" (UID: "7d66752d-c908-41ae-918a-3ed911c9950c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.247644 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d66752d-c908-41ae-918a-3ed911c9950c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7d66752d-c908-41ae-918a-3ed911c9950c" (UID: "7d66752d-c908-41ae-918a-3ed911c9950c"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.322089 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsrgp\" (UniqueName: \"kubernetes.io/projected/7d66752d-c908-41ae-918a-3ed911c9950c-kube-api-access-gsrgp\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.322182 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7d66752d-c908-41ae-918a-3ed911c9950c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.322196 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7d66752d-c908-41ae-918a-3ed911c9950c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.322209 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d66752d-c908-41ae-918a-3ed911c9950c-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.322220 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7d66752d-c908-41ae-918a-3ed911c9950c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.322230 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7d66752d-c908-41ae-918a-3ed911c9950c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.395702 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d66752d-c908-41ae-918a-3ed911c9950c" path="/var/lib/kubelet/pods/7d66752d-c908-41ae-918a-3ed911c9950c/volumes" Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.845107 4738 scope.go:117] "RemoveContainer" 
containerID="828b059b4cf57cc0664f28aa94eea6be2123bc33d6fedb24cc6a583813fd34d7" Mar 07 07:41:20 crc kubenswrapper[4738]: I0307 07:41:20.845146 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-mdslg" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.363861 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz"] Mar 07 07:41:21 crc kubenswrapper[4738]: E0307 07:41:21.364425 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d66752d-c908-41ae-918a-3ed911c9950c" containerName="swift-ring-rebalance" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.364450 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d66752d-c908-41ae-918a-3ed911c9950c" containerName="swift-ring-rebalance" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.364666 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d66752d-c908-41ae-918a-3ed911c9950c" containerName="swift-ring-rebalance" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.365289 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.370073 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.370481 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.370950 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz"] Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.539342 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bed262ae-834f-4b7c-b006-4729068e1ea1-etc-swift\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.539459 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bed262ae-834f-4b7c-b006-4729068e1ea1-ring-data-devices\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.539557 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bed262ae-834f-4b7c-b006-4729068e1ea1-dispersionconf\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.539637 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztsb7\" (UniqueName: \"kubernetes.io/projected/bed262ae-834f-4b7c-b006-4729068e1ea1-kube-api-access-ztsb7\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.539677 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bed262ae-834f-4b7c-b006-4729068e1ea1-scripts\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.539694 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bed262ae-834f-4b7c-b006-4729068e1ea1-swiftconf\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.641121 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bed262ae-834f-4b7c-b006-4729068e1ea1-ring-data-devices\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.641248 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bed262ae-834f-4b7c-b006-4729068e1ea1-dispersionconf\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc 
kubenswrapper[4738]: I0307 07:41:21.641301 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztsb7\" (UniqueName: \"kubernetes.io/projected/bed262ae-834f-4b7c-b006-4729068e1ea1-kube-api-access-ztsb7\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.641336 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bed262ae-834f-4b7c-b006-4729068e1ea1-scripts\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.641361 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bed262ae-834f-4b7c-b006-4729068e1ea1-swiftconf\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.641409 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bed262ae-834f-4b7c-b006-4729068e1ea1-etc-swift\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.642143 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bed262ae-834f-4b7c-b006-4729068e1ea1-etc-swift\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc 
kubenswrapper[4738]: I0307 07:41:21.643047 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bed262ae-834f-4b7c-b006-4729068e1ea1-ring-data-devices\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.643130 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bed262ae-834f-4b7c-b006-4729068e1ea1-scripts\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.650375 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bed262ae-834f-4b7c-b006-4729068e1ea1-swiftconf\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.655680 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bed262ae-834f-4b7c-b006-4729068e1ea1-dispersionconf\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: I0307 07:41:21.666087 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztsb7\" (UniqueName: \"kubernetes.io/projected/bed262ae-834f-4b7c-b006-4729068e1ea1-kube-api-access-ztsb7\") pod \"swift-ring-rebalance-debug-4f2pz\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:21 crc kubenswrapper[4738]: 
I0307 07:41:21.693687 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:22 crc kubenswrapper[4738]: I0307 07:41:22.124931 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz"] Mar 07 07:41:22 crc kubenswrapper[4738]: I0307 07:41:22.868133 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" event={"ID":"bed262ae-834f-4b7c-b006-4729068e1ea1","Type":"ContainerStarted","Data":"0605d2b50860efebc197652f404ef9b74dbb5d41b438c852bb1d892ad668ce65"} Mar 07 07:41:22 crc kubenswrapper[4738]: I0307 07:41:22.868447 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" event={"ID":"bed262ae-834f-4b7c-b006-4729068e1ea1","Type":"ContainerStarted","Data":"169d14e295e8b3a7a424b99313e72c7d5e2519e16b9588b225a0164d76a5d4bd"} Mar 07 07:41:23 crc kubenswrapper[4738]: I0307 07:41:23.885289 4738 generic.go:334] "Generic (PLEG): container finished" podID="bed262ae-834f-4b7c-b006-4729068e1ea1" containerID="0605d2b50860efebc197652f404ef9b74dbb5d41b438c852bb1d892ad668ce65" exitCode=0 Mar 07 07:41:23 crc kubenswrapper[4738]: I0307 07:41:23.885351 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" event={"ID":"bed262ae-834f-4b7c-b006-4729068e1ea1","Type":"ContainerDied","Data":"0605d2b50860efebc197652f404ef9b74dbb5d41b438c852bb1d892ad668ce65"} Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.114213 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.146520 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz"] Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.156017 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz"] Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.221649 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bed262ae-834f-4b7c-b006-4729068e1ea1-etc-swift\") pod \"bed262ae-834f-4b7c-b006-4729068e1ea1\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.221713 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bed262ae-834f-4b7c-b006-4729068e1ea1-ring-data-devices\") pod \"bed262ae-834f-4b7c-b006-4729068e1ea1\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.221778 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bed262ae-834f-4b7c-b006-4729068e1ea1-dispersionconf\") pod \"bed262ae-834f-4b7c-b006-4729068e1ea1\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.221864 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bed262ae-834f-4b7c-b006-4729068e1ea1-scripts\") pod \"bed262ae-834f-4b7c-b006-4729068e1ea1\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.221918 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-ztsb7\" (UniqueName: \"kubernetes.io/projected/bed262ae-834f-4b7c-b006-4729068e1ea1-kube-api-access-ztsb7\") pod \"bed262ae-834f-4b7c-b006-4729068e1ea1\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.221950 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bed262ae-834f-4b7c-b006-4729068e1ea1-swiftconf\") pod \"bed262ae-834f-4b7c-b006-4729068e1ea1\" (UID: \"bed262ae-834f-4b7c-b006-4729068e1ea1\") " Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.222778 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed262ae-834f-4b7c-b006-4729068e1ea1-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "bed262ae-834f-4b7c-b006-4729068e1ea1" (UID: "bed262ae-834f-4b7c-b006-4729068e1ea1"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.223220 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bed262ae-834f-4b7c-b006-4729068e1ea1-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bed262ae-834f-4b7c-b006-4729068e1ea1" (UID: "bed262ae-834f-4b7c-b006-4729068e1ea1"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.227780 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bed262ae-834f-4b7c-b006-4729068e1ea1-kube-api-access-ztsb7" (OuterVolumeSpecName: "kube-api-access-ztsb7") pod "bed262ae-834f-4b7c-b006-4729068e1ea1" (UID: "bed262ae-834f-4b7c-b006-4729068e1ea1"). InnerVolumeSpecName "kube-api-access-ztsb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.242244 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bed262ae-834f-4b7c-b006-4729068e1ea1-scripts" (OuterVolumeSpecName: "scripts") pod "bed262ae-834f-4b7c-b006-4729068e1ea1" (UID: "bed262ae-834f-4b7c-b006-4729068e1ea1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.246442 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed262ae-834f-4b7c-b006-4729068e1ea1-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "bed262ae-834f-4b7c-b006-4729068e1ea1" (UID: "bed262ae-834f-4b7c-b006-4729068e1ea1"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.246976 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bed262ae-834f-4b7c-b006-4729068e1ea1-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "bed262ae-834f-4b7c-b006-4729068e1ea1" (UID: "bed262ae-834f-4b7c-b006-4729068e1ea1"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.323315 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bed262ae-834f-4b7c-b006-4729068e1ea1-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.323341 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztsb7\" (UniqueName: \"kubernetes.io/projected/bed262ae-834f-4b7c-b006-4729068e1ea1-kube-api-access-ztsb7\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.323352 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bed262ae-834f-4b7c-b006-4729068e1ea1-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.323360 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bed262ae-834f-4b7c-b006-4729068e1ea1-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.323369 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bed262ae-834f-4b7c-b006-4729068e1ea1-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.323377 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bed262ae-834f-4b7c-b006-4729068e1ea1-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.903111 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="169d14e295e8b3a7a424b99313e72c7d5e2519e16b9588b225a0164d76a5d4bd" Mar 07 07:41:25 crc kubenswrapper[4738]: I0307 07:41:25.903216 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-4f2pz" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.346374 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq"] Mar 07 07:41:26 crc kubenswrapper[4738]: E0307 07:41:26.346670 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bed262ae-834f-4b7c-b006-4729068e1ea1" containerName="swift-ring-rebalance" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.346685 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="bed262ae-834f-4b7c-b006-4729068e1ea1" containerName="swift-ring-rebalance" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.346880 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="bed262ae-834f-4b7c-b006-4729068e1ea1" containerName="swift-ring-rebalance" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.347533 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.350606 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.350657 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.357143 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq"] Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.396687 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bed262ae-834f-4b7c-b006-4729068e1ea1" path="/var/lib/kubelet/pods/bed262ae-834f-4b7c-b006-4729068e1ea1/volumes" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.439854 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-ring-data-devices\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.439914 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-scripts\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.439934 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-dispersionconf\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.439952 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-etc-swift\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.440065 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-swiftconf\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.440201 4738 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvkbs\" (UniqueName: \"kubernetes.io/projected/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-kube-api-access-vvkbs\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.541387 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-scripts\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.541433 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-dispersionconf\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.541453 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-etc-swift\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.541478 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-swiftconf\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 
07:41:26.541499 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvkbs\" (UniqueName: \"kubernetes.io/projected/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-kube-api-access-vvkbs\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.541562 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-ring-data-devices\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.541888 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-etc-swift\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.542355 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-scripts\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.542397 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-ring-data-devices\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: 
I0307 07:41:26.546318 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-swiftconf\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.547876 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-dispersionconf\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.560682 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvkbs\" (UniqueName: \"kubernetes.io/projected/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-kube-api-access-vvkbs\") pod \"swift-ring-rebalance-debug-l2cqq\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.665367 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.958277 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.958639 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.958683 4738 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.959266 4738 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee"} pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 07 07:41:26 crc kubenswrapper[4738]: I0307 07:41:26.959331 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" containerID="cri-o://549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" gracePeriod=600 Mar 07 07:41:27 crc kubenswrapper[4738]: I0307 07:41:27.092673 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq"] Mar 07 07:41:27 crc kubenswrapper[4738]: E0307 07:41:27.114692 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:41:27 crc kubenswrapper[4738]: I0307 07:41:27.923703 4738 generic.go:334] "Generic (PLEG): container finished" podID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" exitCode=0 Mar 07 07:41:27 crc kubenswrapper[4738]: I0307 07:41:27.923799 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerDied","Data":"549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee"} Mar 07 07:41:27 crc kubenswrapper[4738]: I0307 07:41:27.923871 4738 scope.go:117] "RemoveContainer" containerID="3f2f40850859badc575bb7ed1a011243a7dda83e8be3d1302a68de4787f67eff" Mar 07 07:41:27 crc kubenswrapper[4738]: I0307 07:41:27.924607 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:41:27 crc kubenswrapper[4738]: E0307 07:41:27.925027 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 
07 07:41:27 crc kubenswrapper[4738]: I0307 07:41:27.925673 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" event={"ID":"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978","Type":"ContainerStarted","Data":"88ba929bfd6136291907134ecdf65bef9c422f65ed79cacca79fd3a991206811"} Mar 07 07:41:27 crc kubenswrapper[4738]: I0307 07:41:27.925701 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" event={"ID":"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978","Type":"ContainerStarted","Data":"8c42b93c6c39cd13b0a324fc88c4e8d4f4591e5c1e129f92e20edd5971d0354b"} Mar 07 07:41:27 crc kubenswrapper[4738]: I0307 07:41:27.974983 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" podStartSLOduration=1.974964002 podStartE2EDuration="1.974964002s" podCreationTimestamp="2026-03-07 07:41:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:41:27.972970629 +0000 UTC m=+2506.437958000" watchObservedRunningTime="2026-03-07 07:41:27.974964002 +0000 UTC m=+2506.439951343" Mar 07 07:41:28 crc kubenswrapper[4738]: I0307 07:41:28.938722 4738 generic.go:334] "Generic (PLEG): container finished" podID="8a525fe7-0e6a-4ea9-9891-9c4cbacc6978" containerID="88ba929bfd6136291907134ecdf65bef9c422f65ed79cacca79fd3a991206811" exitCode=0 Mar 07 07:41:28 crc kubenswrapper[4738]: I0307 07:41:28.938809 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" event={"ID":"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978","Type":"ContainerDied","Data":"88ba929bfd6136291907134ecdf65bef9c422f65ed79cacca79fd3a991206811"} Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.311636 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.339576 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq"] Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.372927 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq"] Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.512723 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-dispersionconf\") pod \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.512776 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvkbs\" (UniqueName: \"kubernetes.io/projected/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-kube-api-access-vvkbs\") pod \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.512864 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-ring-data-devices\") pod \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.512925 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-scripts\") pod \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.512962 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-swiftconf\") pod \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.512996 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-etc-swift\") pod \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\" (UID: \"8a525fe7-0e6a-4ea9-9891-9c4cbacc6978\") " Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.513788 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "8a525fe7-0e6a-4ea9-9891-9c4cbacc6978" (UID: "8a525fe7-0e6a-4ea9-9891-9c4cbacc6978"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.513961 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "8a525fe7-0e6a-4ea9-9891-9c4cbacc6978" (UID: "8a525fe7-0e6a-4ea9-9891-9c4cbacc6978"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.521492 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-kube-api-access-vvkbs" (OuterVolumeSpecName: "kube-api-access-vvkbs") pod "8a525fe7-0e6a-4ea9-9891-9c4cbacc6978" (UID: "8a525fe7-0e6a-4ea9-9891-9c4cbacc6978"). InnerVolumeSpecName "kube-api-access-vvkbs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.535170 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-scripts" (OuterVolumeSpecName: "scripts") pod "8a525fe7-0e6a-4ea9-9891-9c4cbacc6978" (UID: "8a525fe7-0e6a-4ea9-9891-9c4cbacc6978"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.538682 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "8a525fe7-0e6a-4ea9-9891-9c4cbacc6978" (UID: "8a525fe7-0e6a-4ea9-9891-9c4cbacc6978"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.538860 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "8a525fe7-0e6a-4ea9-9891-9c4cbacc6978" (UID: "8a525fe7-0e6a-4ea9-9891-9c4cbacc6978"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.614869 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.615134 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.615222 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.615291 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.615374 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.615460 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvkbs\" (UniqueName: \"kubernetes.io/projected/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978-kube-api-access-vvkbs\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.961785 4738 scope.go:117] "RemoveContainer" containerID="88ba929bfd6136291907134ecdf65bef9c422f65ed79cacca79fd3a991206811" Mar 07 07:41:30 crc kubenswrapper[4738]: I0307 07:41:30.961850 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-l2cqq" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.511047 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t"] Mar 07 07:41:31 crc kubenswrapper[4738]: E0307 07:41:31.511645 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a525fe7-0e6a-4ea9-9891-9c4cbacc6978" containerName="swift-ring-rebalance" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.511679 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a525fe7-0e6a-4ea9-9891-9c4cbacc6978" containerName="swift-ring-rebalance" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.512014 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a525fe7-0e6a-4ea9-9891-9c4cbacc6978" containerName="swift-ring-rebalance" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.513099 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.515101 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.515906 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.529649 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t"] Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.530411 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/53f6f681-8c18-4c73-a5c1-e097888b4d17-dispersionconf\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.530564 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrv77\" (UniqueName: \"kubernetes.io/projected/53f6f681-8c18-4c73-a5c1-e097888b4d17-kube-api-access-zrv77\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.530661 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/53f6f681-8c18-4c73-a5c1-e097888b4d17-swiftconf\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.530709 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/53f6f681-8c18-4c73-a5c1-e097888b4d17-ring-data-devices\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.530915 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53f6f681-8c18-4c73-a5c1-e097888b4d17-scripts\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.531001 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/53f6f681-8c18-4c73-a5c1-e097888b4d17-etc-swift\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.631909 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53f6f681-8c18-4c73-a5c1-e097888b4d17-scripts\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.632268 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/53f6f681-8c18-4c73-a5c1-e097888b4d17-etc-swift\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.632313 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/53f6f681-8c18-4c73-a5c1-e097888b4d17-dispersionconf\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.632362 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrv77\" (UniqueName: \"kubernetes.io/projected/53f6f681-8c18-4c73-a5c1-e097888b4d17-kube-api-access-zrv77\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.632405 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" 
(UniqueName: \"kubernetes.io/secret/53f6f681-8c18-4c73-a5c1-e097888b4d17-swiftconf\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.632427 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/53f6f681-8c18-4c73-a5c1-e097888b4d17-ring-data-devices\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.632607 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/53f6f681-8c18-4c73-a5c1-e097888b4d17-etc-swift\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.632608 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53f6f681-8c18-4c73-a5c1-e097888b4d17-scripts\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.633133 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/53f6f681-8c18-4c73-a5c1-e097888b4d17-ring-data-devices\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.635635 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/53f6f681-8c18-4c73-a5c1-e097888b4d17-swiftconf\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.635659 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/53f6f681-8c18-4c73-a5c1-e097888b4d17-dispersionconf\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.647752 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrv77\" (UniqueName: \"kubernetes.io/projected/53f6f681-8c18-4c73-a5c1-e097888b4d17-kube-api-access-zrv77\") pod \"swift-ring-rebalance-debug-jhh9t\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:31 crc kubenswrapper[4738]: I0307 07:41:31.874736 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:32 crc kubenswrapper[4738]: I0307 07:41:32.362465 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t"] Mar 07 07:41:32 crc kubenswrapper[4738]: I0307 07:41:32.398745 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a525fe7-0e6a-4ea9-9891-9c4cbacc6978" path="/var/lib/kubelet/pods/8a525fe7-0e6a-4ea9-9891-9c4cbacc6978/volumes" Mar 07 07:41:32 crc kubenswrapper[4738]: I0307 07:41:32.984189 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" event={"ID":"53f6f681-8c18-4c73-a5c1-e097888b4d17","Type":"ContainerStarted","Data":"cec6a0ef6754c78b76ee26c02e2903841b1c101f43bd2ea7e217a3f5e5f7561b"} Mar 07 07:41:32 crc kubenswrapper[4738]: I0307 07:41:32.984502 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" event={"ID":"53f6f681-8c18-4c73-a5c1-e097888b4d17","Type":"ContainerStarted","Data":"df78ea42fc0dbd164e4fe49c187221ae20aab532c4a3e7500eb30fc8a35295ec"} Mar 07 07:41:33 crc kubenswrapper[4738]: I0307 07:41:33.027360 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" podStartSLOduration=2.027339805 podStartE2EDuration="2.027339805s" podCreationTimestamp="2026-03-07 07:41:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:41:33.022233778 +0000 UTC m=+2511.487221109" watchObservedRunningTime="2026-03-07 07:41:33.027339805 +0000 UTC m=+2511.492327126" Mar 07 07:41:34 crc kubenswrapper[4738]: I0307 07:41:34.007021 4738 generic.go:334] "Generic (PLEG): container finished" podID="53f6f681-8c18-4c73-a5c1-e097888b4d17" containerID="cec6a0ef6754c78b76ee26c02e2903841b1c101f43bd2ea7e217a3f5e5f7561b" exitCode=0 
Mar 07 07:41:34 crc kubenswrapper[4738]: I0307 07:41:34.007090 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" event={"ID":"53f6f681-8c18-4c73-a5c1-e097888b4d17","Type":"ContainerDied","Data":"cec6a0ef6754c78b76ee26c02e2903841b1c101f43bd2ea7e217a3f5e5f7561b"} Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.385374 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.421992 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t"] Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.428779 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t"] Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.505823 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/53f6f681-8c18-4c73-a5c1-e097888b4d17-dispersionconf\") pod \"53f6f681-8c18-4c73-a5c1-e097888b4d17\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.505915 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/53f6f681-8c18-4c73-a5c1-e097888b4d17-etc-swift\") pod \"53f6f681-8c18-4c73-a5c1-e097888b4d17\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.505955 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/53f6f681-8c18-4c73-a5c1-e097888b4d17-swiftconf\") pod \"53f6f681-8c18-4c73-a5c1-e097888b4d17\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.505998 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/53f6f681-8c18-4c73-a5c1-e097888b4d17-ring-data-devices\") pod \"53f6f681-8c18-4c73-a5c1-e097888b4d17\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.506040 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrv77\" (UniqueName: \"kubernetes.io/projected/53f6f681-8c18-4c73-a5c1-e097888b4d17-kube-api-access-zrv77\") pod \"53f6f681-8c18-4c73-a5c1-e097888b4d17\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.506125 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53f6f681-8c18-4c73-a5c1-e097888b4d17-scripts\") pod \"53f6f681-8c18-4c73-a5c1-e097888b4d17\" (UID: \"53f6f681-8c18-4c73-a5c1-e097888b4d17\") " Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.506959 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53f6f681-8c18-4c73-a5c1-e097888b4d17-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "53f6f681-8c18-4c73-a5c1-e097888b4d17" (UID: "53f6f681-8c18-4c73-a5c1-e097888b4d17"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.507530 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/53f6f681-8c18-4c73-a5c1-e097888b4d17-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "53f6f681-8c18-4c73-a5c1-e097888b4d17" (UID: "53f6f681-8c18-4c73-a5c1-e097888b4d17"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.511734 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53f6f681-8c18-4c73-a5c1-e097888b4d17-kube-api-access-zrv77" (OuterVolumeSpecName: "kube-api-access-zrv77") pod "53f6f681-8c18-4c73-a5c1-e097888b4d17" (UID: "53f6f681-8c18-4c73-a5c1-e097888b4d17"). InnerVolumeSpecName "kube-api-access-zrv77". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.527612 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53f6f681-8c18-4c73-a5c1-e097888b4d17-scripts" (OuterVolumeSpecName: "scripts") pod "53f6f681-8c18-4c73-a5c1-e097888b4d17" (UID: "53f6f681-8c18-4c73-a5c1-e097888b4d17"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.533497 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f6f681-8c18-4c73-a5c1-e097888b4d17-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "53f6f681-8c18-4c73-a5c1-e097888b4d17" (UID: "53f6f681-8c18-4c73-a5c1-e097888b4d17"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.545477 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53f6f681-8c18-4c73-a5c1-e097888b4d17-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "53f6f681-8c18-4c73-a5c1-e097888b4d17" (UID: "53f6f681-8c18-4c73-a5c1-e097888b4d17"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.608492 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/53f6f681-8c18-4c73-a5c1-e097888b4d17-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.608546 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/53f6f681-8c18-4c73-a5c1-e097888b4d17-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.608567 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/53f6f681-8c18-4c73-a5c1-e097888b4d17-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.608584 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/53f6f681-8c18-4c73-a5c1-e097888b4d17-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.608602 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrv77\" (UniqueName: \"kubernetes.io/projected/53f6f681-8c18-4c73-a5c1-e097888b4d17-kube-api-access-zrv77\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:35 crc kubenswrapper[4738]: I0307 07:41:35.608621 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53f6f681-8c18-4c73-a5c1-e097888b4d17-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.041277 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df78ea42fc0dbd164e4fe49c187221ae20aab532c4a3e7500eb30fc8a35295ec" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.041331 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-jhh9t" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.393864 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53f6f681-8c18-4c73-a5c1-e097888b4d17" path="/var/lib/kubelet/pods/53f6f681-8c18-4c73-a5c1-e097888b4d17/volumes" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.616866 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7z25p"] Mar 07 07:41:36 crc kubenswrapper[4738]: E0307 07:41:36.617215 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53f6f681-8c18-4c73-a5c1-e097888b4d17" containerName="swift-ring-rebalance" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.617233 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="53f6f681-8c18-4c73-a5c1-e097888b4d17" containerName="swift-ring-rebalance" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.617450 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="53f6f681-8c18-4c73-a5c1-e097888b4d17" containerName="swift-ring-rebalance" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.618101 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.624025 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7z25p"] Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.624846 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.625891 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.723429 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h682n\" (UniqueName: \"kubernetes.io/projected/b587c7d9-2ad8-494a-b471-1593c91f72f3-kube-api-access-h682n\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.723709 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b587c7d9-2ad8-494a-b471-1593c91f72f3-dispersionconf\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.723743 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b587c7d9-2ad8-494a-b471-1593c91f72f3-swiftconf\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.723770 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b587c7d9-2ad8-494a-b471-1593c91f72f3-etc-swift\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.723801 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b587c7d9-2ad8-494a-b471-1593c91f72f3-scripts\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.723818 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b587c7d9-2ad8-494a-b471-1593c91f72f3-ring-data-devices\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.825016 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b587c7d9-2ad8-494a-b471-1593c91f72f3-scripts\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.825074 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b587c7d9-2ad8-494a-b471-1593c91f72f3-ring-data-devices\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc 
kubenswrapper[4738]: I0307 07:41:36.825148 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h682n\" (UniqueName: \"kubernetes.io/projected/b587c7d9-2ad8-494a-b471-1593c91f72f3-kube-api-access-h682n\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.825227 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b587c7d9-2ad8-494a-b471-1593c91f72f3-dispersionconf\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.825274 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b587c7d9-2ad8-494a-b471-1593c91f72f3-swiftconf\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.825325 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b587c7d9-2ad8-494a-b471-1593c91f72f3-etc-swift\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.826489 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b587c7d9-2ad8-494a-b471-1593c91f72f3-scripts\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc 
kubenswrapper[4738]: I0307 07:41:36.826548 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b587c7d9-2ad8-494a-b471-1593c91f72f3-etc-swift\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.826729 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b587c7d9-2ad8-494a-b471-1593c91f72f3-ring-data-devices\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.831656 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b587c7d9-2ad8-494a-b471-1593c91f72f3-dispersionconf\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.831728 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b587c7d9-2ad8-494a-b471-1593c91f72f3-swiftconf\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: I0307 07:41:36.848520 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h682n\" (UniqueName: \"kubernetes.io/projected/b587c7d9-2ad8-494a-b471-1593c91f72f3-kube-api-access-h682n\") pod \"swift-ring-rebalance-debug-7z25p\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:36 crc kubenswrapper[4738]: 
I0307 07:41:36.936119 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:37 crc kubenswrapper[4738]: I0307 07:41:37.132620 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7z25p"] Mar 07 07:41:37 crc kubenswrapper[4738]: W0307 07:41:37.142188 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb587c7d9_2ad8_494a_b471_1593c91f72f3.slice/crio-a25118fc356c60285eb8f54ff17cd4a7ec9c33a231ca2de51acf0c86f031398d WatchSource:0}: Error finding container a25118fc356c60285eb8f54ff17cd4a7ec9c33a231ca2de51acf0c86f031398d: Status 404 returned error can't find the container with id a25118fc356c60285eb8f54ff17cd4a7ec9c33a231ca2de51acf0c86f031398d Mar 07 07:41:38 crc kubenswrapper[4738]: I0307 07:41:38.063799 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" event={"ID":"b587c7d9-2ad8-494a-b471-1593c91f72f3","Type":"ContainerStarted","Data":"f34ad99a874273df7f08265b145c56f9ef18864f84eccea2b3f72bbe7a780f4d"} Mar 07 07:41:38 crc kubenswrapper[4738]: I0307 07:41:38.064150 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" event={"ID":"b587c7d9-2ad8-494a-b471-1593c91f72f3","Type":"ContainerStarted","Data":"a25118fc356c60285eb8f54ff17cd4a7ec9c33a231ca2de51acf0c86f031398d"} Mar 07 07:41:38 crc kubenswrapper[4738]: I0307 07:41:38.085525 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" podStartSLOduration=2.085509183 podStartE2EDuration="2.085509183s" podCreationTimestamp="2026-03-07 07:41:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:41:38.081117955 +0000 
UTC m=+2516.546105296" watchObservedRunningTime="2026-03-07 07:41:38.085509183 +0000 UTC m=+2516.550496504" Mar 07 07:41:39 crc kubenswrapper[4738]: I0307 07:41:39.073085 4738 generic.go:334] "Generic (PLEG): container finished" podID="b587c7d9-2ad8-494a-b471-1593c91f72f3" containerID="f34ad99a874273df7f08265b145c56f9ef18864f84eccea2b3f72bbe7a780f4d" exitCode=0 Mar 07 07:41:39 crc kubenswrapper[4738]: I0307 07:41:39.073144 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" event={"ID":"b587c7d9-2ad8-494a-b471-1593c91f72f3","Type":"ContainerDied","Data":"f34ad99a874273df7f08265b145c56f9ef18864f84eccea2b3f72bbe7a780f4d"} Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.371027 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.385424 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:41:40 crc kubenswrapper[4738]: E0307 07:41:40.385817 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.407186 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7z25p"] Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.413045 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-7z25p"] Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.491064 4738 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b587c7d9-2ad8-494a-b471-1593c91f72f3-scripts\") pod \"b587c7d9-2ad8-494a-b471-1593c91f72f3\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.491220 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b587c7d9-2ad8-494a-b471-1593c91f72f3-ring-data-devices\") pod \"b587c7d9-2ad8-494a-b471-1593c91f72f3\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.491268 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b587c7d9-2ad8-494a-b471-1593c91f72f3-dispersionconf\") pod \"b587c7d9-2ad8-494a-b471-1593c91f72f3\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.491310 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b587c7d9-2ad8-494a-b471-1593c91f72f3-etc-swift\") pod \"b587c7d9-2ad8-494a-b471-1593c91f72f3\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.491438 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b587c7d9-2ad8-494a-b471-1593c91f72f3-swiftconf\") pod \"b587c7d9-2ad8-494a-b471-1593c91f72f3\" (UID: \"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.491495 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h682n\" (UniqueName: \"kubernetes.io/projected/b587c7d9-2ad8-494a-b471-1593c91f72f3-kube-api-access-h682n\") pod \"b587c7d9-2ad8-494a-b471-1593c91f72f3\" (UID: 
\"b587c7d9-2ad8-494a-b471-1593c91f72f3\") " Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.492620 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b587c7d9-2ad8-494a-b471-1593c91f72f3-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b587c7d9-2ad8-494a-b471-1593c91f72f3" (UID: "b587c7d9-2ad8-494a-b471-1593c91f72f3"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.492765 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b587c7d9-2ad8-494a-b471-1593c91f72f3-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b587c7d9-2ad8-494a-b471-1593c91f72f3" (UID: "b587c7d9-2ad8-494a-b471-1593c91f72f3"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.502516 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b587c7d9-2ad8-494a-b471-1593c91f72f3-kube-api-access-h682n" (OuterVolumeSpecName: "kube-api-access-h682n") pod "b587c7d9-2ad8-494a-b471-1593c91f72f3" (UID: "b587c7d9-2ad8-494a-b471-1593c91f72f3"). InnerVolumeSpecName "kube-api-access-h682n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.515976 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b587c7d9-2ad8-494a-b471-1593c91f72f3-scripts" (OuterVolumeSpecName: "scripts") pod "b587c7d9-2ad8-494a-b471-1593c91f72f3" (UID: "b587c7d9-2ad8-494a-b471-1593c91f72f3"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.518217 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b587c7d9-2ad8-494a-b471-1593c91f72f3-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b587c7d9-2ad8-494a-b471-1593c91f72f3" (UID: "b587c7d9-2ad8-494a-b471-1593c91f72f3"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.519699 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b587c7d9-2ad8-494a-b471-1593c91f72f3-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b587c7d9-2ad8-494a-b471-1593c91f72f3" (UID: "b587c7d9-2ad8-494a-b471-1593c91f72f3"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.593102 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b587c7d9-2ad8-494a-b471-1593c91f72f3-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.593149 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b587c7d9-2ad8-494a-b471-1593c91f72f3-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.593181 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b587c7d9-2ad8-494a-b471-1593c91f72f3-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.593193 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b587c7d9-2ad8-494a-b471-1593c91f72f3-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:40 crc 
kubenswrapper[4738]: I0307 07:41:40.593205 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b587c7d9-2ad8-494a-b471-1593c91f72f3-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:40 crc kubenswrapper[4738]: I0307 07:41:40.593217 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h682n\" (UniqueName: \"kubernetes.io/projected/b587c7d9-2ad8-494a-b471-1593c91f72f3-kube-api-access-h682n\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.091393 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a25118fc356c60285eb8f54ff17cd4a7ec9c33a231ca2de51acf0c86f031398d" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.091455 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-7z25p" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.535114 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rdngk"] Mar 07 07:41:41 crc kubenswrapper[4738]: E0307 07:41:41.535932 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b587c7d9-2ad8-494a-b471-1593c91f72f3" containerName="swift-ring-rebalance" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.535956 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="b587c7d9-2ad8-494a-b471-1593c91f72f3" containerName="swift-ring-rebalance" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.536235 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="b587c7d9-2ad8-494a-b471-1593c91f72f3" containerName="swift-ring-rebalance" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.536963 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.542474 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.542944 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.544446 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rdngk"] Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.608725 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1bdf967d-2c32-445c-85ec-a6c20a4cb444-scripts\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.608777 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1bdf967d-2c32-445c-85ec-a6c20a4cb444-etc-swift\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.608795 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1bdf967d-2c32-445c-85ec-a6c20a4cb444-dispersionconf\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.608826 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1bdf967d-2c32-445c-85ec-a6c20a4cb444-ring-data-devices\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.608844 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g5vc\" (UniqueName: \"kubernetes.io/projected/1bdf967d-2c32-445c-85ec-a6c20a4cb444-kube-api-access-7g5vc\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.608869 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1bdf967d-2c32-445c-85ec-a6c20a4cb444-swiftconf\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.710917 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1bdf967d-2c32-445c-85ec-a6c20a4cb444-scripts\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.711001 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1bdf967d-2c32-445c-85ec-a6c20a4cb444-etc-swift\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.711038 4738 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1bdf967d-2c32-445c-85ec-a6c20a4cb444-dispersionconf\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.711102 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g5vc\" (UniqueName: \"kubernetes.io/projected/1bdf967d-2c32-445c-85ec-a6c20a4cb444-kube-api-access-7g5vc\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.711135 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1bdf967d-2c32-445c-85ec-a6c20a4cb444-ring-data-devices\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.711505 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1bdf967d-2c32-445c-85ec-a6c20a4cb444-etc-swift\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.711967 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1bdf967d-2c32-445c-85ec-a6c20a4cb444-scripts\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.712548 
4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1bdf967d-2c32-445c-85ec-a6c20a4cb444-ring-data-devices\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.712662 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1bdf967d-2c32-445c-85ec-a6c20a4cb444-swiftconf\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.721204 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1bdf967d-2c32-445c-85ec-a6c20a4cb444-dispersionconf\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.721299 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1bdf967d-2c32-445c-85ec-a6c20a4cb444-swiftconf\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.735064 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g5vc\" (UniqueName: \"kubernetes.io/projected/1bdf967d-2c32-445c-85ec-a6c20a4cb444-kube-api-access-7g5vc\") pod \"swift-ring-rebalance-debug-rdngk\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:41 crc kubenswrapper[4738]: I0307 07:41:41.851671 4738 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:42 crc kubenswrapper[4738]: I0307 07:41:42.058094 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rdngk"] Mar 07 07:41:42 crc kubenswrapper[4738]: W0307 07:41:42.072780 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1bdf967d_2c32_445c_85ec_a6c20a4cb444.slice/crio-c7697bd41757ab263b14363f72420d65f81cae196ba2494c48fdd2994a962828 WatchSource:0}: Error finding container c7697bd41757ab263b14363f72420d65f81cae196ba2494c48fdd2994a962828: Status 404 returned error can't find the container with id c7697bd41757ab263b14363f72420d65f81cae196ba2494c48fdd2994a962828 Mar 07 07:41:42 crc kubenswrapper[4738]: I0307 07:41:42.099868 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" event={"ID":"1bdf967d-2c32-445c-85ec-a6c20a4cb444","Type":"ContainerStarted","Data":"c7697bd41757ab263b14363f72420d65f81cae196ba2494c48fdd2994a962828"} Mar 07 07:41:42 crc kubenswrapper[4738]: I0307 07:41:42.403662 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b587c7d9-2ad8-494a-b471-1593c91f72f3" path="/var/lib/kubelet/pods/b587c7d9-2ad8-494a-b471-1593c91f72f3/volumes" Mar 07 07:41:43 crc kubenswrapper[4738]: I0307 07:41:43.112213 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" event={"ID":"1bdf967d-2c32-445c-85ec-a6c20a4cb444","Type":"ContainerStarted","Data":"82670cda4a5da5bd69a732e07039acbd39db98f89ace22d01c5252fb0885d12d"} Mar 07 07:41:43 crc kubenswrapper[4738]: I0307 07:41:43.147366 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" podStartSLOduration=2.147346739 
podStartE2EDuration="2.147346739s" podCreationTimestamp="2026-03-07 07:41:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:41:43.130101786 +0000 UTC m=+2521.595089137" watchObservedRunningTime="2026-03-07 07:41:43.147346739 +0000 UTC m=+2521.612334060" Mar 07 07:41:44 crc kubenswrapper[4738]: I0307 07:41:44.121456 4738 generic.go:334] "Generic (PLEG): container finished" podID="1bdf967d-2c32-445c-85ec-a6c20a4cb444" containerID="82670cda4a5da5bd69a732e07039acbd39db98f89ace22d01c5252fb0885d12d" exitCode=0 Mar 07 07:41:44 crc kubenswrapper[4738]: I0307 07:41:44.121498 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" event={"ID":"1bdf967d-2c32-445c-85ec-a6c20a4cb444","Type":"ContainerDied","Data":"82670cda4a5da5bd69a732e07039acbd39db98f89ace22d01c5252fb0885d12d"} Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.430101 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.461257 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rdngk"] Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.466040 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-rdngk"] Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.574815 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1bdf967d-2c32-445c-85ec-a6c20a4cb444-etc-swift\") pod \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.574912 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7g5vc\" (UniqueName: \"kubernetes.io/projected/1bdf967d-2c32-445c-85ec-a6c20a4cb444-kube-api-access-7g5vc\") pod \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.574977 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1bdf967d-2c32-445c-85ec-a6c20a4cb444-dispersionconf\") pod \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.575003 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1bdf967d-2c32-445c-85ec-a6c20a4cb444-swiftconf\") pod \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.575085 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/1bdf967d-2c32-445c-85ec-a6c20a4cb444-scripts\") pod \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.575101 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1bdf967d-2c32-445c-85ec-a6c20a4cb444-ring-data-devices\") pod \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\" (UID: \"1bdf967d-2c32-445c-85ec-a6c20a4cb444\") " Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.575694 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bdf967d-2c32-445c-85ec-a6c20a4cb444-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "1bdf967d-2c32-445c-85ec-a6c20a4cb444" (UID: "1bdf967d-2c32-445c-85ec-a6c20a4cb444"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.575834 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bdf967d-2c32-445c-85ec-a6c20a4cb444-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "1bdf967d-2c32-445c-85ec-a6c20a4cb444" (UID: "1bdf967d-2c32-445c-85ec-a6c20a4cb444"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.585087 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bdf967d-2c32-445c-85ec-a6c20a4cb444-kube-api-access-7g5vc" (OuterVolumeSpecName: "kube-api-access-7g5vc") pod "1bdf967d-2c32-445c-85ec-a6c20a4cb444" (UID: "1bdf967d-2c32-445c-85ec-a6c20a4cb444"). InnerVolumeSpecName "kube-api-access-7g5vc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.607329 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bdf967d-2c32-445c-85ec-a6c20a4cb444-scripts" (OuterVolumeSpecName: "scripts") pod "1bdf967d-2c32-445c-85ec-a6c20a4cb444" (UID: "1bdf967d-2c32-445c-85ec-a6c20a4cb444"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.608142 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bdf967d-2c32-445c-85ec-a6c20a4cb444-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "1bdf967d-2c32-445c-85ec-a6c20a4cb444" (UID: "1bdf967d-2c32-445c-85ec-a6c20a4cb444"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.617226 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bdf967d-2c32-445c-85ec-a6c20a4cb444-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "1bdf967d-2c32-445c-85ec-a6c20a4cb444" (UID: "1bdf967d-2c32-445c-85ec-a6c20a4cb444"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.676239 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/1bdf967d-2c32-445c-85ec-a6c20a4cb444-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.676274 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/1bdf967d-2c32-445c-85ec-a6c20a4cb444-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.676285 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1bdf967d-2c32-445c-85ec-a6c20a4cb444-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.676296 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/1bdf967d-2c32-445c-85ec-a6c20a4cb444-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.676305 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/1bdf967d-2c32-445c-85ec-a6c20a4cb444-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:45 crc kubenswrapper[4738]: I0307 07:41:45.676314 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7g5vc\" (UniqueName: \"kubernetes.io/projected/1bdf967d-2c32-445c-85ec-a6c20a4cb444-kube-api-access-7g5vc\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.137193 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c7697bd41757ab263b14363f72420d65f81cae196ba2494c48fdd2994a962828" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.137259 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-rdngk" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.403278 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bdf967d-2c32-445c-85ec-a6c20a4cb444" path="/var/lib/kubelet/pods/1bdf967d-2c32-445c-85ec-a6c20a4cb444/volumes" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.611097 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c"] Mar 07 07:41:46 crc kubenswrapper[4738]: E0307 07:41:46.611754 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bdf967d-2c32-445c-85ec-a6c20a4cb444" containerName="swift-ring-rebalance" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.611780 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bdf967d-2c32-445c-85ec-a6c20a4cb444" containerName="swift-ring-rebalance" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.612001 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bdf967d-2c32-445c-85ec-a6c20a4cb444" containerName="swift-ring-rebalance" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.642960 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.646204 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.647513 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.651403 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c"] Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.793593 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/39b5eb47-5db2-4930-a3f4-15269d767b0c-ring-data-devices\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.793650 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5t9c\" (UniqueName: \"kubernetes.io/projected/39b5eb47-5db2-4930-a3f4-15269d767b0c-kube-api-access-f5t9c\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.793695 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/39b5eb47-5db2-4930-a3f4-15269d767b0c-swiftconf\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.793716 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/39b5eb47-5db2-4930-a3f4-15269d767b0c-etc-swift\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.793749 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/39b5eb47-5db2-4930-a3f4-15269d767b0c-dispersionconf\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.793844 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39b5eb47-5db2-4930-a3f4-15269d767b0c-scripts\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.895239 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/39b5eb47-5db2-4930-a3f4-15269d767b0c-swiftconf\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.895572 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/39b5eb47-5db2-4930-a3f4-15269d767b0c-etc-swift\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 
07:41:46.895615 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/39b5eb47-5db2-4930-a3f4-15269d767b0c-dispersionconf\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.895641 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39b5eb47-5db2-4930-a3f4-15269d767b0c-scripts\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.895723 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/39b5eb47-5db2-4930-a3f4-15269d767b0c-ring-data-devices\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.895761 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5t9c\" (UniqueName: \"kubernetes.io/projected/39b5eb47-5db2-4930-a3f4-15269d767b0c-kube-api-access-f5t9c\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.896299 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/39b5eb47-5db2-4930-a3f4-15269d767b0c-etc-swift\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc 
kubenswrapper[4738]: I0307 07:41:46.896732 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39b5eb47-5db2-4930-a3f4-15269d767b0c-scripts\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.896764 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/39b5eb47-5db2-4930-a3f4-15269d767b0c-ring-data-devices\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.905845 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/39b5eb47-5db2-4930-a3f4-15269d767b0c-swiftconf\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.905884 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/39b5eb47-5db2-4930-a3f4-15269d767b0c-dispersionconf\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: I0307 07:41:46.912907 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5t9c\" (UniqueName: \"kubernetes.io/projected/39b5eb47-5db2-4930-a3f4-15269d767b0c-kube-api-access-f5t9c\") pod \"swift-ring-rebalance-debug-9qx8c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:46 crc kubenswrapper[4738]: 
I0307 07:41:46.971449 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:47 crc kubenswrapper[4738]: I0307 07:41:47.411979 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c"] Mar 07 07:41:48 crc kubenswrapper[4738]: I0307 07:41:48.160539 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" event={"ID":"39b5eb47-5db2-4930-a3f4-15269d767b0c","Type":"ContainerStarted","Data":"8e099504f7ca0eaf8d64b32848961c4b8c74ca54eed1f418d78ef2964fab477f"} Mar 07 07:41:48 crc kubenswrapper[4738]: I0307 07:41:48.160930 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" event={"ID":"39b5eb47-5db2-4930-a3f4-15269d767b0c","Type":"ContainerStarted","Data":"bda5fd32a94a0ebfaa4d82081f2961efe402c60e07bdfe143e00c73d2c8d44ad"} Mar 07 07:41:48 crc kubenswrapper[4738]: I0307 07:41:48.186950 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" podStartSLOduration=2.186930907 podStartE2EDuration="2.186930907s" podCreationTimestamp="2026-03-07 07:41:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:41:48.182826028 +0000 UTC m=+2526.647813419" watchObservedRunningTime="2026-03-07 07:41:48.186930907 +0000 UTC m=+2526.651918248" Mar 07 07:41:48 crc kubenswrapper[4738]: I0307 07:41:48.244031 4738 scope.go:117] "RemoveContainer" containerID="713b29849e2af4600c1bdfcd486d63bdb05382be1d857b895deac05cfc98158d" Mar 07 07:41:48 crc kubenswrapper[4738]: I0307 07:41:48.278425 4738 scope.go:117] "RemoveContainer" containerID="4865152f084910d04b52fcf18797127b471c68b10683c54cd7848eb6e738b288" Mar 07 07:41:48 crc kubenswrapper[4738]: I0307 07:41:48.332632 4738 
scope.go:117] "RemoveContainer" containerID="8f2a192a2bfef3c624894e8cdb76cfbc7091a3b730fba395a0d60640f8efe5f7" Mar 07 07:41:48 crc kubenswrapper[4738]: I0307 07:41:48.369701 4738 scope.go:117] "RemoveContainer" containerID="925e30982c4842bce8bc171832f5a644fba761046f8d33da5ce5707e29174ca1" Mar 07 07:41:48 crc kubenswrapper[4738]: I0307 07:41:48.405815 4738 scope.go:117] "RemoveContainer" containerID="eff191649166959173501b3653c56a69969a882daf49d883093ced5e1624f39e" Mar 07 07:41:48 crc kubenswrapper[4738]: I0307 07:41:48.434031 4738 scope.go:117] "RemoveContainer" containerID="0c7d7cf72c7859650d11474f855544943bd4f58c0a1500d8f5204aadc80a106a" Mar 07 07:41:49 crc kubenswrapper[4738]: I0307 07:41:49.170936 4738 generic.go:334] "Generic (PLEG): container finished" podID="39b5eb47-5db2-4930-a3f4-15269d767b0c" containerID="8e099504f7ca0eaf8d64b32848961c4b8c74ca54eed1f418d78ef2964fab477f" exitCode=0 Mar 07 07:41:49 crc kubenswrapper[4738]: I0307 07:41:49.171024 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" event={"ID":"39b5eb47-5db2-4930-a3f4-15269d767b0c","Type":"ContainerDied","Data":"8e099504f7ca0eaf8d64b32848961c4b8c74ca54eed1f418d78ef2964fab477f"} Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.492056 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.526595 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c"] Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.531878 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c"] Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.559462 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/39b5eb47-5db2-4930-a3f4-15269d767b0c-swiftconf\") pod \"39b5eb47-5db2-4930-a3f4-15269d767b0c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.559565 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5t9c\" (UniqueName: \"kubernetes.io/projected/39b5eb47-5db2-4930-a3f4-15269d767b0c-kube-api-access-f5t9c\") pod \"39b5eb47-5db2-4930-a3f4-15269d767b0c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.559591 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/39b5eb47-5db2-4930-a3f4-15269d767b0c-dispersionconf\") pod \"39b5eb47-5db2-4930-a3f4-15269d767b0c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.559633 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/39b5eb47-5db2-4930-a3f4-15269d767b0c-ring-data-devices\") pod \"39b5eb47-5db2-4930-a3f4-15269d767b0c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.559649 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/39b5eb47-5db2-4930-a3f4-15269d767b0c-scripts\") pod \"39b5eb47-5db2-4930-a3f4-15269d767b0c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.559690 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/39b5eb47-5db2-4930-a3f4-15269d767b0c-etc-swift\") pod \"39b5eb47-5db2-4930-a3f4-15269d767b0c\" (UID: \"39b5eb47-5db2-4930-a3f4-15269d767b0c\") " Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.560516 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39b5eb47-5db2-4930-a3f4-15269d767b0c-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "39b5eb47-5db2-4930-a3f4-15269d767b0c" (UID: "39b5eb47-5db2-4930-a3f4-15269d767b0c"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.560613 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b5eb47-5db2-4930-a3f4-15269d767b0c-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "39b5eb47-5db2-4930-a3f4-15269d767b0c" (UID: "39b5eb47-5db2-4930-a3f4-15269d767b0c"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.574988 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39b5eb47-5db2-4930-a3f4-15269d767b0c-kube-api-access-f5t9c" (OuterVolumeSpecName: "kube-api-access-f5t9c") pod "39b5eb47-5db2-4930-a3f4-15269d767b0c" (UID: "39b5eb47-5db2-4930-a3f4-15269d767b0c"). InnerVolumeSpecName "kube-api-access-f5t9c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.582581 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39b5eb47-5db2-4930-a3f4-15269d767b0c-scripts" (OuterVolumeSpecName: "scripts") pod "39b5eb47-5db2-4930-a3f4-15269d767b0c" (UID: "39b5eb47-5db2-4930-a3f4-15269d767b0c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.584610 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39b5eb47-5db2-4930-a3f4-15269d767b0c-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "39b5eb47-5db2-4930-a3f4-15269d767b0c" (UID: "39b5eb47-5db2-4930-a3f4-15269d767b0c"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.585310 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39b5eb47-5db2-4930-a3f4-15269d767b0c-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "39b5eb47-5db2-4930-a3f4-15269d767b0c" (UID: "39b5eb47-5db2-4930-a3f4-15269d767b0c"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.661737 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5t9c\" (UniqueName: \"kubernetes.io/projected/39b5eb47-5db2-4930-a3f4-15269d767b0c-kube-api-access-f5t9c\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.661780 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/39b5eb47-5db2-4930-a3f4-15269d767b0c-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.661791 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/39b5eb47-5db2-4930-a3f4-15269d767b0c-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.661831 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/39b5eb47-5db2-4930-a3f4-15269d767b0c-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.661840 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/39b5eb47-5db2-4930-a3f4-15269d767b0c-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:50 crc kubenswrapper[4738]: I0307 07:41:50.661850 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/39b5eb47-5db2-4930-a3f4-15269d767b0c-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.192552 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bda5fd32a94a0ebfaa4d82081f2961efe402c60e07bdfe143e00c73d2c8d44ad" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.192642 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-9qx8c" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.718115 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-54rvp"] Mar 07 07:41:51 crc kubenswrapper[4738]: E0307 07:41:51.718497 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39b5eb47-5db2-4930-a3f4-15269d767b0c" containerName="swift-ring-rebalance" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.718512 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="39b5eb47-5db2-4930-a3f4-15269d767b0c" containerName="swift-ring-rebalance" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.718678 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="39b5eb47-5db2-4930-a3f4-15269d767b0c" containerName="swift-ring-rebalance" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.719303 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.722848 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.724817 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.730992 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-54rvp"] Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.779911 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9838bb70-c02f-49ff-9c9c-508d3190ec27-dispersionconf\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.779974 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbzst\" (UniqueName: \"kubernetes.io/projected/9838bb70-c02f-49ff-9c9c-508d3190ec27-kube-api-access-bbzst\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.780007 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9838bb70-c02f-49ff-9c9c-508d3190ec27-scripts\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.780060 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9838bb70-c02f-49ff-9c9c-508d3190ec27-swiftconf\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.780086 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9838bb70-c02f-49ff-9c9c-508d3190ec27-etc-swift\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.780183 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: 
\"kubernetes.io/configmap/9838bb70-c02f-49ff-9c9c-508d3190ec27-ring-data-devices\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.881554 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9838bb70-c02f-49ff-9c9c-508d3190ec27-etc-swift\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.881624 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9838bb70-c02f-49ff-9c9c-508d3190ec27-ring-data-devices\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.881704 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9838bb70-c02f-49ff-9c9c-508d3190ec27-dispersionconf\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.881755 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbzst\" (UniqueName: \"kubernetes.io/projected/9838bb70-c02f-49ff-9c9c-508d3190ec27-kube-api-access-bbzst\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.881789 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9838bb70-c02f-49ff-9c9c-508d3190ec27-scripts\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.881816 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9838bb70-c02f-49ff-9c9c-508d3190ec27-swiftconf\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.882066 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9838bb70-c02f-49ff-9c9c-508d3190ec27-etc-swift\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.883170 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9838bb70-c02f-49ff-9c9c-508d3190ec27-scripts\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.883254 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9838bb70-c02f-49ff-9c9c-508d3190ec27-ring-data-devices\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.886870 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/9838bb70-c02f-49ff-9c9c-508d3190ec27-swiftconf\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.897041 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9838bb70-c02f-49ff-9c9c-508d3190ec27-dispersionconf\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:51 crc kubenswrapper[4738]: I0307 07:41:51.899868 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbzst\" (UniqueName: \"kubernetes.io/projected/9838bb70-c02f-49ff-9c9c-508d3190ec27-kube-api-access-bbzst\") pod \"swift-ring-rebalance-debug-54rvp\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:52 crc kubenswrapper[4738]: I0307 07:41:52.037534 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:52 crc kubenswrapper[4738]: I0307 07:41:52.397592 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39b5eb47-5db2-4930-a3f4-15269d767b0c" path="/var/lib/kubelet/pods/39b5eb47-5db2-4930-a3f4-15269d767b0c/volumes" Mar 07 07:41:52 crc kubenswrapper[4738]: I0307 07:41:52.477513 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-54rvp"] Mar 07 07:41:53 crc kubenswrapper[4738]: I0307 07:41:53.217465 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" event={"ID":"9838bb70-c02f-49ff-9c9c-508d3190ec27","Type":"ContainerStarted","Data":"0ad4e62b9900bd9605c15226b29c3f46304ff37e1a122f233e2bda5cdb38646b"} Mar 07 07:41:53 crc kubenswrapper[4738]: I0307 07:41:53.217826 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" event={"ID":"9838bb70-c02f-49ff-9c9c-508d3190ec27","Type":"ContainerStarted","Data":"97da220fc75d6f953c4a1b6dd16fe1651a67e0409621271c62e65193944fbd06"} Mar 07 07:41:53 crc kubenswrapper[4738]: I0307 07:41:53.241553 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" podStartSLOduration=2.24153187 podStartE2EDuration="2.24153187s" podCreationTimestamp="2026-03-07 07:41:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:41:53.232640272 +0000 UTC m=+2531.697627613" watchObservedRunningTime="2026-03-07 07:41:53.24153187 +0000 UTC m=+2531.706519201" Mar 07 07:41:54 crc kubenswrapper[4738]: I0307 07:41:54.229264 4738 generic.go:334] "Generic (PLEG): container finished" podID="9838bb70-c02f-49ff-9c9c-508d3190ec27" containerID="0ad4e62b9900bd9605c15226b29c3f46304ff37e1a122f233e2bda5cdb38646b" exitCode=0 
Mar 07 07:41:54 crc kubenswrapper[4738]: I0307 07:41:54.229316 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" event={"ID":"9838bb70-c02f-49ff-9c9c-508d3190ec27","Type":"ContainerDied","Data":"0ad4e62b9900bd9605c15226b29c3f46304ff37e1a122f233e2bda5cdb38646b"} Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.385811 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:41:55 crc kubenswrapper[4738]: E0307 07:41:55.386345 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.466426 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.497775 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-54rvp"] Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.504199 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-54rvp"] Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.534428 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbzst\" (UniqueName: \"kubernetes.io/projected/9838bb70-c02f-49ff-9c9c-508d3190ec27-kube-api-access-bbzst\") pod \"9838bb70-c02f-49ff-9c9c-508d3190ec27\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.534632 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9838bb70-c02f-49ff-9c9c-508d3190ec27-scripts\") pod \"9838bb70-c02f-49ff-9c9c-508d3190ec27\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.534731 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9838bb70-c02f-49ff-9c9c-508d3190ec27-swiftconf\") pod \"9838bb70-c02f-49ff-9c9c-508d3190ec27\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.534983 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9838bb70-c02f-49ff-9c9c-508d3190ec27-ring-data-devices\") pod \"9838bb70-c02f-49ff-9c9c-508d3190ec27\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.535184 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9838bb70-c02f-49ff-9c9c-508d3190ec27-dispersionconf\") pod \"9838bb70-c02f-49ff-9c9c-508d3190ec27\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.535343 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9838bb70-c02f-49ff-9c9c-508d3190ec27-etc-swift\") pod \"9838bb70-c02f-49ff-9c9c-508d3190ec27\" (UID: \"9838bb70-c02f-49ff-9c9c-508d3190ec27\") " Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.535579 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9838bb70-c02f-49ff-9c9c-508d3190ec27-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9838bb70-c02f-49ff-9c9c-508d3190ec27" (UID: "9838bb70-c02f-49ff-9c9c-508d3190ec27"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.535925 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9838bb70-c02f-49ff-9c9c-508d3190ec27-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9838bb70-c02f-49ff-9c9c-508d3190ec27" (UID: "9838bb70-c02f-49ff-9c9c-508d3190ec27"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.536442 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9838bb70-c02f-49ff-9c9c-508d3190ec27-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.536468 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9838bb70-c02f-49ff-9c9c-508d3190ec27-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.544816 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9838bb70-c02f-49ff-9c9c-508d3190ec27-kube-api-access-bbzst" (OuterVolumeSpecName: "kube-api-access-bbzst") pod "9838bb70-c02f-49ff-9c9c-508d3190ec27" (UID: "9838bb70-c02f-49ff-9c9c-508d3190ec27"). InnerVolumeSpecName "kube-api-access-bbzst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.562510 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9838bb70-c02f-49ff-9c9c-508d3190ec27-scripts" (OuterVolumeSpecName: "scripts") pod "9838bb70-c02f-49ff-9c9c-508d3190ec27" (UID: "9838bb70-c02f-49ff-9c9c-508d3190ec27"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.566655 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9838bb70-c02f-49ff-9c9c-508d3190ec27-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9838bb70-c02f-49ff-9c9c-508d3190ec27" (UID: "9838bb70-c02f-49ff-9c9c-508d3190ec27"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.566823 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9838bb70-c02f-49ff-9c9c-508d3190ec27-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9838bb70-c02f-49ff-9c9c-508d3190ec27" (UID: "9838bb70-c02f-49ff-9c9c-508d3190ec27"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.637454 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9838bb70-c02f-49ff-9c9c-508d3190ec27-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.637492 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9838bb70-c02f-49ff-9c9c-508d3190ec27-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.637503 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9838bb70-c02f-49ff-9c9c-508d3190ec27-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:55 crc kubenswrapper[4738]: I0307 07:41:55.637515 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bbzst\" (UniqueName: \"kubernetes.io/projected/9838bb70-c02f-49ff-9c9c-508d3190ec27-kube-api-access-bbzst\") on node \"crc\" DevicePath \"\"" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.249546 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="97da220fc75d6f953c4a1b6dd16fe1651a67e0409621271c62e65193944fbd06" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.249645 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-54rvp" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.394788 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9838bb70-c02f-49ff-9c9c-508d3190ec27" path="/var/lib/kubelet/pods/9838bb70-c02f-49ff-9c9c-508d3190ec27/volumes" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.675305 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-md48n"] Mar 07 07:41:56 crc kubenswrapper[4738]: E0307 07:41:56.675692 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9838bb70-c02f-49ff-9c9c-508d3190ec27" containerName="swift-ring-rebalance" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.675718 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="9838bb70-c02f-49ff-9c9c-508d3190ec27" containerName="swift-ring-rebalance" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.675991 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="9838bb70-c02f-49ff-9c9c-508d3190ec27" containerName="swift-ring-rebalance" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.676757 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.678748 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.678782 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.687079 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-md48n"] Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.750444 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hnsk\" (UniqueName: \"kubernetes.io/projected/e85ed0a9-ed70-4d48-b551-24170b049295-kube-api-access-6hnsk\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.750518 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e85ed0a9-ed70-4d48-b551-24170b049295-ring-data-devices\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.750637 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e85ed0a9-ed70-4d48-b551-24170b049295-swiftconf\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.750665 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e85ed0a9-ed70-4d48-b551-24170b049295-dispersionconf\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.750693 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e85ed0a9-ed70-4d48-b551-24170b049295-scripts\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.750718 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e85ed0a9-ed70-4d48-b551-24170b049295-etc-swift\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.852273 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e85ed0a9-ed70-4d48-b551-24170b049295-swiftconf\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.852332 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e85ed0a9-ed70-4d48-b551-24170b049295-dispersionconf\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 
07:41:56.852370 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e85ed0a9-ed70-4d48-b551-24170b049295-scripts\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.852396 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e85ed0a9-ed70-4d48-b551-24170b049295-etc-swift\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.852424 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hnsk\" (UniqueName: \"kubernetes.io/projected/e85ed0a9-ed70-4d48-b551-24170b049295-kube-api-access-6hnsk\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.852468 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e85ed0a9-ed70-4d48-b551-24170b049295-ring-data-devices\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.853064 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e85ed0a9-ed70-4d48-b551-24170b049295-etc-swift\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: 
I0307 07:41:56.853263 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e85ed0a9-ed70-4d48-b551-24170b049295-ring-data-devices\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.853556 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e85ed0a9-ed70-4d48-b551-24170b049295-scripts\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.856099 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e85ed0a9-ed70-4d48-b551-24170b049295-dispersionconf\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.856147 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e85ed0a9-ed70-4d48-b551-24170b049295-swiftconf\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:56 crc kubenswrapper[4738]: I0307 07:41:56.870938 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hnsk\" (UniqueName: \"kubernetes.io/projected/e85ed0a9-ed70-4d48-b551-24170b049295-kube-api-access-6hnsk\") pod \"swift-ring-rebalance-debug-md48n\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:57 crc kubenswrapper[4738]: I0307 07:41:57.005800 
4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:41:57 crc kubenswrapper[4738]: I0307 07:41:57.427673 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-md48n"] Mar 07 07:41:57 crc kubenswrapper[4738]: W0307 07:41:57.430643 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode85ed0a9_ed70_4d48_b551_24170b049295.slice/crio-d322e77beecd64c01c8e7f39b547c706956638e33410c86ba33af0810597d899 WatchSource:0}: Error finding container d322e77beecd64c01c8e7f39b547c706956638e33410c86ba33af0810597d899: Status 404 returned error can't find the container with id d322e77beecd64c01c8e7f39b547c706956638e33410c86ba33af0810597d899 Mar 07 07:41:58 crc kubenswrapper[4738]: I0307 07:41:58.267853 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" event={"ID":"e85ed0a9-ed70-4d48-b551-24170b049295","Type":"ContainerStarted","Data":"b2895047cf261d9a053be3204ca0c5595b0e12379a20c28bb71c36024f09edb5"} Mar 07 07:41:58 crc kubenswrapper[4738]: I0307 07:41:58.268144 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" event={"ID":"e85ed0a9-ed70-4d48-b551-24170b049295","Type":"ContainerStarted","Data":"d322e77beecd64c01c8e7f39b547c706956638e33410c86ba33af0810597d899"} Mar 07 07:41:58 crc kubenswrapper[4738]: I0307 07:41:58.286712 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" podStartSLOduration=2.286694749 podStartE2EDuration="2.286694749s" podCreationTimestamp="2026-03-07 07:41:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:41:58.282423284 +0000 UTC 
m=+2536.747410605" watchObservedRunningTime="2026-03-07 07:41:58.286694749 +0000 UTC m=+2536.751682070" Mar 07 07:41:59 crc kubenswrapper[4738]: I0307 07:41:59.276386 4738 generic.go:334] "Generic (PLEG): container finished" podID="e85ed0a9-ed70-4d48-b551-24170b049295" containerID="b2895047cf261d9a053be3204ca0c5595b0e12379a20c28bb71c36024f09edb5" exitCode=0 Mar 07 07:41:59 crc kubenswrapper[4738]: I0307 07:41:59.276433 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" event={"ID":"e85ed0a9-ed70-4d48-b551-24170b049295","Type":"ContainerDied","Data":"b2895047cf261d9a053be3204ca0c5595b0e12379a20c28bb71c36024f09edb5"} Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.136577 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547822-j4n2s"] Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.137939 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547822-j4n2s" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.139939 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.140068 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.140118 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.143715 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547822-j4n2s"] Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.196336 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mx5vd\" (UniqueName: 
\"kubernetes.io/projected/a15887fc-8d37-4d3f-a195-b17ecf256717-kube-api-access-mx5vd\") pod \"auto-csr-approver-29547822-j4n2s\" (UID: \"a15887fc-8d37-4d3f-a195-b17ecf256717\") " pod="openshift-infra/auto-csr-approver-29547822-j4n2s" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.297548 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mx5vd\" (UniqueName: \"kubernetes.io/projected/a15887fc-8d37-4d3f-a195-b17ecf256717-kube-api-access-mx5vd\") pod \"auto-csr-approver-29547822-j4n2s\" (UID: \"a15887fc-8d37-4d3f-a195-b17ecf256717\") " pod="openshift-infra/auto-csr-approver-29547822-j4n2s" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.319777 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mx5vd\" (UniqueName: \"kubernetes.io/projected/a15887fc-8d37-4d3f-a195-b17ecf256717-kube-api-access-mx5vd\") pod \"auto-csr-approver-29547822-j4n2s\" (UID: \"a15887fc-8d37-4d3f-a195-b17ecf256717\") " pod="openshift-infra/auto-csr-approver-29547822-j4n2s" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.462904 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547822-j4n2s" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.600272 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.636273 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-md48n"] Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.641867 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-md48n"] Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.704305 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hnsk\" (UniqueName: \"kubernetes.io/projected/e85ed0a9-ed70-4d48-b551-24170b049295-kube-api-access-6hnsk\") pod \"e85ed0a9-ed70-4d48-b551-24170b049295\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.704458 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e85ed0a9-ed70-4d48-b551-24170b049295-swiftconf\") pod \"e85ed0a9-ed70-4d48-b551-24170b049295\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.704500 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e85ed0a9-ed70-4d48-b551-24170b049295-etc-swift\") pod \"e85ed0a9-ed70-4d48-b551-24170b049295\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.704531 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e85ed0a9-ed70-4d48-b551-24170b049295-scripts\") pod \"e85ed0a9-ed70-4d48-b551-24170b049295\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.704573 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" 
(UniqueName: \"kubernetes.io/configmap/e85ed0a9-ed70-4d48-b551-24170b049295-ring-data-devices\") pod \"e85ed0a9-ed70-4d48-b551-24170b049295\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.704666 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e85ed0a9-ed70-4d48-b551-24170b049295-dispersionconf\") pod \"e85ed0a9-ed70-4d48-b551-24170b049295\" (UID: \"e85ed0a9-ed70-4d48-b551-24170b049295\") " Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.705284 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e85ed0a9-ed70-4d48-b551-24170b049295-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e85ed0a9-ed70-4d48-b551-24170b049295" (UID: "e85ed0a9-ed70-4d48-b551-24170b049295"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.705295 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e85ed0a9-ed70-4d48-b551-24170b049295-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e85ed0a9-ed70-4d48-b551-24170b049295" (UID: "e85ed0a9-ed70-4d48-b551-24170b049295"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.705887 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e85ed0a9-ed70-4d48-b551-24170b049295-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.705917 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e85ed0a9-ed70-4d48-b551-24170b049295-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.707646 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e85ed0a9-ed70-4d48-b551-24170b049295-kube-api-access-6hnsk" (OuterVolumeSpecName: "kube-api-access-6hnsk") pod "e85ed0a9-ed70-4d48-b551-24170b049295" (UID: "e85ed0a9-ed70-4d48-b551-24170b049295"). InnerVolumeSpecName "kube-api-access-6hnsk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.726011 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85ed0a9-ed70-4d48-b551-24170b049295-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e85ed0a9-ed70-4d48-b551-24170b049295" (UID: "e85ed0a9-ed70-4d48-b551-24170b049295"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.726551 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e85ed0a9-ed70-4d48-b551-24170b049295-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e85ed0a9-ed70-4d48-b551-24170b049295" (UID: "e85ed0a9-ed70-4d48-b551-24170b049295"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.730645 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e85ed0a9-ed70-4d48-b551-24170b049295-scripts" (OuterVolumeSpecName: "scripts") pod "e85ed0a9-ed70-4d48-b551-24170b049295" (UID: "e85ed0a9-ed70-4d48-b551-24170b049295"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.807657 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e85ed0a9-ed70-4d48-b551-24170b049295-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.807702 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e85ed0a9-ed70-4d48-b551-24170b049295-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.807716 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e85ed0a9-ed70-4d48-b551-24170b049295-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.807732 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hnsk\" (UniqueName: \"kubernetes.io/projected/e85ed0a9-ed70-4d48-b551-24170b049295-kube-api-access-6hnsk\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:00 crc kubenswrapper[4738]: I0307 07:42:00.909402 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547822-j4n2s"] Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.294099 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547822-j4n2s" 
event={"ID":"a15887fc-8d37-4d3f-a195-b17ecf256717","Type":"ContainerStarted","Data":"f516f67f4c5e39fb29c7a4c4afc29c72a32bce141627481ba198f84f35e5e8c0"} Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.295863 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d322e77beecd64c01c8e7f39b547c706956638e33410c86ba33af0810597d899" Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.295937 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-md48n" Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.803841 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-84v4t"] Mar 07 07:42:01 crc kubenswrapper[4738]: E0307 07:42:01.804868 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e85ed0a9-ed70-4d48-b551-24170b049295" containerName="swift-ring-rebalance" Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.804884 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="e85ed0a9-ed70-4d48-b551-24170b049295" containerName="swift-ring-rebalance" Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.805374 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="e85ed0a9-ed70-4d48-b551-24170b049295" containerName="swift-ring-rebalance" Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.806198 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.811969 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.814491 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.817000 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-84v4t"] Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.924696 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b314261-caf1-496b-ac93-33ecefcb7c28-dispersionconf\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.924757 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b314261-caf1-496b-ac93-33ecefcb7c28-ring-data-devices\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.924792 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b314261-caf1-496b-ac93-33ecefcb7c28-scripts\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.924953 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b314261-caf1-496b-ac93-33ecefcb7c28-swiftconf\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.925064 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b314261-caf1-496b-ac93-33ecefcb7c28-etc-swift\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:01 crc kubenswrapper[4738]: I0307 07:42:01.925267 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42rbg\" (UniqueName: \"kubernetes.io/projected/2b314261-caf1-496b-ac93-33ecefcb7c28-kube-api-access-42rbg\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:02 crc kubenswrapper[4738]: I0307 07:42:02.027006 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42rbg\" (UniqueName: \"kubernetes.io/projected/2b314261-caf1-496b-ac93-33ecefcb7c28-kube-api-access-42rbg\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:02 crc kubenswrapper[4738]: I0307 07:42:02.027084 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b314261-caf1-496b-ac93-33ecefcb7c28-dispersionconf\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 
07:42:02 crc kubenswrapper[4738]: I0307 07:42:02.027106 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b314261-caf1-496b-ac93-33ecefcb7c28-ring-data-devices\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:02 crc kubenswrapper[4738]: I0307 07:42:02.027130 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b314261-caf1-496b-ac93-33ecefcb7c28-scripts\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:02 crc kubenswrapper[4738]: I0307 07:42:02.027172 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b314261-caf1-496b-ac93-33ecefcb7c28-swiftconf\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:02 crc kubenswrapper[4738]: I0307 07:42:02.027202 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b314261-caf1-496b-ac93-33ecefcb7c28-etc-swift\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:02 crc kubenswrapper[4738]: I0307 07:42:02.027598 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b314261-caf1-496b-ac93-33ecefcb7c28-etc-swift\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:02 crc 
kubenswrapper[4738]: I0307 07:42:02.028333 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b314261-caf1-496b-ac93-33ecefcb7c28-ring-data-devices\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:02 crc kubenswrapper[4738]: I0307 07:42:02.029386 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b314261-caf1-496b-ac93-33ecefcb7c28-scripts\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:02 crc kubenswrapper[4738]: I0307 07:42:02.044050 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b314261-caf1-496b-ac93-33ecefcb7c28-swiftconf\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:02 crc kubenswrapper[4738]: I0307 07:42:02.044092 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b314261-caf1-496b-ac93-33ecefcb7c28-dispersionconf\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:02 crc kubenswrapper[4738]: I0307 07:42:02.050519 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42rbg\" (UniqueName: \"kubernetes.io/projected/2b314261-caf1-496b-ac93-33ecefcb7c28-kube-api-access-42rbg\") pod \"swift-ring-rebalance-debug-84v4t\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:02 crc kubenswrapper[4738]: 
I0307 07:42:02.132060 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:02 crc kubenswrapper[4738]: I0307 07:42:02.398142 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e85ed0a9-ed70-4d48-b551-24170b049295" path="/var/lib/kubelet/pods/e85ed0a9-ed70-4d48-b551-24170b049295/volumes" Mar 07 07:42:02 crc kubenswrapper[4738]: I0307 07:42:02.612537 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-84v4t"] Mar 07 07:42:02 crc kubenswrapper[4738]: W0307 07:42:02.680029 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b314261_caf1_496b_ac93_33ecefcb7c28.slice/crio-db99b3d469031e6206e892473b0066192e3885bc2f3a7c7f9d6c3aa469acdc79 WatchSource:0}: Error finding container db99b3d469031e6206e892473b0066192e3885bc2f3a7c7f9d6c3aa469acdc79: Status 404 returned error can't find the container with id db99b3d469031e6206e892473b0066192e3885bc2f3a7c7f9d6c3aa469acdc79 Mar 07 07:42:03 crc kubenswrapper[4738]: I0307 07:42:03.324014 4738 generic.go:334] "Generic (PLEG): container finished" podID="a15887fc-8d37-4d3f-a195-b17ecf256717" containerID="9ff7a0f733008bdfeefa8c03f9b720ab32f582e7b8261823a50efabe1e701523" exitCode=0 Mar 07 07:42:03 crc kubenswrapper[4738]: I0307 07:42:03.324219 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547822-j4n2s" event={"ID":"a15887fc-8d37-4d3f-a195-b17ecf256717","Type":"ContainerDied","Data":"9ff7a0f733008bdfeefa8c03f9b720ab32f582e7b8261823a50efabe1e701523"} Mar 07 07:42:03 crc kubenswrapper[4738]: I0307 07:42:03.326545 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" 
event={"ID":"2b314261-caf1-496b-ac93-33ecefcb7c28","Type":"ContainerStarted","Data":"dde2cbe63459b0a3d58442e5ac0df81a71cbf68b5f716d45aee18730e73f6a13"} Mar 07 07:42:03 crc kubenswrapper[4738]: I0307 07:42:03.326578 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" event={"ID":"2b314261-caf1-496b-ac93-33ecefcb7c28","Type":"ContainerStarted","Data":"db99b3d469031e6206e892473b0066192e3885bc2f3a7c7f9d6c3aa469acdc79"} Mar 07 07:42:03 crc kubenswrapper[4738]: I0307 07:42:03.365187 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" podStartSLOduration=2.365169991 podStartE2EDuration="2.365169991s" podCreationTimestamp="2026-03-07 07:42:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:42:03.363117146 +0000 UTC m=+2541.828104467" watchObservedRunningTime="2026-03-07 07:42:03.365169991 +0000 UTC m=+2541.830157312" Mar 07 07:42:04 crc kubenswrapper[4738]: I0307 07:42:04.337386 4738 generic.go:334] "Generic (PLEG): container finished" podID="2b314261-caf1-496b-ac93-33ecefcb7c28" containerID="dde2cbe63459b0a3d58442e5ac0df81a71cbf68b5f716d45aee18730e73f6a13" exitCode=0 Mar 07 07:42:04 crc kubenswrapper[4738]: I0307 07:42:04.337447 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" event={"ID":"2b314261-caf1-496b-ac93-33ecefcb7c28","Type":"ContainerDied","Data":"dde2cbe63459b0a3d58442e5ac0df81a71cbf68b5f716d45aee18730e73f6a13"} Mar 07 07:42:04 crc kubenswrapper[4738]: I0307 07:42:04.595012 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547822-j4n2s" Mar 07 07:42:04 crc kubenswrapper[4738]: I0307 07:42:04.665251 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mx5vd\" (UniqueName: \"kubernetes.io/projected/a15887fc-8d37-4d3f-a195-b17ecf256717-kube-api-access-mx5vd\") pod \"a15887fc-8d37-4d3f-a195-b17ecf256717\" (UID: \"a15887fc-8d37-4d3f-a195-b17ecf256717\") " Mar 07 07:42:04 crc kubenswrapper[4738]: I0307 07:42:04.671059 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a15887fc-8d37-4d3f-a195-b17ecf256717-kube-api-access-mx5vd" (OuterVolumeSpecName: "kube-api-access-mx5vd") pod "a15887fc-8d37-4d3f-a195-b17ecf256717" (UID: "a15887fc-8d37-4d3f-a195-b17ecf256717"). InnerVolumeSpecName "kube-api-access-mx5vd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:42:04 crc kubenswrapper[4738]: I0307 07:42:04.766876 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mx5vd\" (UniqueName: \"kubernetes.io/projected/a15887fc-8d37-4d3f-a195-b17ecf256717-kube-api-access-mx5vd\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.348595 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547822-j4n2s" event={"ID":"a15887fc-8d37-4d3f-a195-b17ecf256717","Type":"ContainerDied","Data":"f516f67f4c5e39fb29c7a4c4afc29c72a32bce141627481ba198f84f35e5e8c0"} Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.348945 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f516f67f4c5e39fb29c7a4c4afc29c72a32bce141627481ba198f84f35e5e8c0" Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.348632 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547822-j4n2s" Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.621516 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.653599 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-84v4t"] Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.659649 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-84v4t"] Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.664889 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547816-nk69t"] Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.671795 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547816-nk69t"] Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.680865 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42rbg\" (UniqueName: \"kubernetes.io/projected/2b314261-caf1-496b-ac93-33ecefcb7c28-kube-api-access-42rbg\") pod \"2b314261-caf1-496b-ac93-33ecefcb7c28\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.680940 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b314261-caf1-496b-ac93-33ecefcb7c28-swiftconf\") pod \"2b314261-caf1-496b-ac93-33ecefcb7c28\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.681026 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b314261-caf1-496b-ac93-33ecefcb7c28-etc-swift\") pod \"2b314261-caf1-496b-ac93-33ecefcb7c28\" (UID: 
\"2b314261-caf1-496b-ac93-33ecefcb7c28\") " Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.681069 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b314261-caf1-496b-ac93-33ecefcb7c28-dispersionconf\") pod \"2b314261-caf1-496b-ac93-33ecefcb7c28\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.681092 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b314261-caf1-496b-ac93-33ecefcb7c28-ring-data-devices\") pod \"2b314261-caf1-496b-ac93-33ecefcb7c28\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.681228 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b314261-caf1-496b-ac93-33ecefcb7c28-scripts\") pod \"2b314261-caf1-496b-ac93-33ecefcb7c28\" (UID: \"2b314261-caf1-496b-ac93-33ecefcb7c28\") " Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.681851 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b314261-caf1-496b-ac93-33ecefcb7c28-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2b314261-caf1-496b-ac93-33ecefcb7c28" (UID: "2b314261-caf1-496b-ac93-33ecefcb7c28"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.681861 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b314261-caf1-496b-ac93-33ecefcb7c28-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "2b314261-caf1-496b-ac93-33ecefcb7c28" (UID: "2b314261-caf1-496b-ac93-33ecefcb7c28"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.682325 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/2b314261-caf1-496b-ac93-33ecefcb7c28-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.682344 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/2b314261-caf1-496b-ac93-33ecefcb7c28-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.684853 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b314261-caf1-496b-ac93-33ecefcb7c28-kube-api-access-42rbg" (OuterVolumeSpecName: "kube-api-access-42rbg") pod "2b314261-caf1-496b-ac93-33ecefcb7c28" (UID: "2b314261-caf1-496b-ac93-33ecefcb7c28"). InnerVolumeSpecName "kube-api-access-42rbg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.697961 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b314261-caf1-496b-ac93-33ecefcb7c28-scripts" (OuterVolumeSpecName: "scripts") pod "2b314261-caf1-496b-ac93-33ecefcb7c28" (UID: "2b314261-caf1-496b-ac93-33ecefcb7c28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.702503 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b314261-caf1-496b-ac93-33ecefcb7c28-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "2b314261-caf1-496b-ac93-33ecefcb7c28" (UID: "2b314261-caf1-496b-ac93-33ecefcb7c28"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.702538 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b314261-caf1-496b-ac93-33ecefcb7c28-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "2b314261-caf1-496b-ac93-33ecefcb7c28" (UID: "2b314261-caf1-496b-ac93-33ecefcb7c28"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.783644 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2b314261-caf1-496b-ac93-33ecefcb7c28-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.783879 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42rbg\" (UniqueName: \"kubernetes.io/projected/2b314261-caf1-496b-ac93-33ecefcb7c28-kube-api-access-42rbg\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.783960 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/2b314261-caf1-496b-ac93-33ecefcb7c28-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:05 crc kubenswrapper[4738]: I0307 07:42:05.784069 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/2b314261-caf1-496b-ac93-33ecefcb7c28-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.359401 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="db99b3d469031e6206e892473b0066192e3885bc2f3a7c7f9d6c3aa469acdc79" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.359489 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-84v4t" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.405882 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b314261-caf1-496b-ac93-33ecefcb7c28" path="/var/lib/kubelet/pods/2b314261-caf1-496b-ac93-33ecefcb7c28/volumes" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.406575 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c531e025-5682-49d9-9e7e-0e7191fc00a4" path="/var/lib/kubelet/pods/c531e025-5682-49d9-9e7e-0e7191fc00a4/volumes" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.814907 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dj74j"] Mar 07 07:42:06 crc kubenswrapper[4738]: E0307 07:42:06.815498 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a15887fc-8d37-4d3f-a195-b17ecf256717" containerName="oc" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.815664 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="a15887fc-8d37-4d3f-a195-b17ecf256717" containerName="oc" Mar 07 07:42:06 crc kubenswrapper[4738]: E0307 07:42:06.815802 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b314261-caf1-496b-ac93-33ecefcb7c28" containerName="swift-ring-rebalance" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.815900 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b314261-caf1-496b-ac93-33ecefcb7c28" containerName="swift-ring-rebalance" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.816227 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="a15887fc-8d37-4d3f-a195-b17ecf256717" containerName="oc" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.816342 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b314261-caf1-496b-ac93-33ecefcb7c28" containerName="swift-ring-rebalance" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.817010 4738 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.819025 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-scripts" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.819459 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"swift-kuttl-tests"/"swift-ring-config-data" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.823233 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dj74j"] Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.899996 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-dispersionconf\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.900332 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-ring-data-devices\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.900458 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-swiftconf\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.900605 4738 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-etc-swift\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.900675 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-scripts\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:06 crc kubenswrapper[4738]: I0307 07:42:06.900822 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fngxv\" (UniqueName: \"kubernetes.io/projected/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-kube-api-access-fngxv\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:07 crc kubenswrapper[4738]: I0307 07:42:07.002449 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-scripts\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:07 crc kubenswrapper[4738]: I0307 07:42:07.002573 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fngxv\" (UniqueName: \"kubernetes.io/projected/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-kube-api-access-fngxv\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:07 crc kubenswrapper[4738]: I0307 07:42:07.002680 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-dispersionconf\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:07 crc kubenswrapper[4738]: I0307 07:42:07.002710 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-ring-data-devices\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:07 crc kubenswrapper[4738]: I0307 07:42:07.002742 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-swiftconf\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:07 crc kubenswrapper[4738]: I0307 07:42:07.002770 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-etc-swift\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:07 crc kubenswrapper[4738]: I0307 07:42:07.003435 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-scripts\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:07 crc kubenswrapper[4738]: I0307 07:42:07.003487 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-etc-swift\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:07 crc kubenswrapper[4738]: I0307 07:42:07.003982 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-ring-data-devices\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:07 crc kubenswrapper[4738]: I0307 07:42:07.007194 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-dispersionconf\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:07 crc kubenswrapper[4738]: I0307 07:42:07.009625 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-swiftconf\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:07 crc kubenswrapper[4738]: I0307 07:42:07.033078 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fngxv\" (UniqueName: \"kubernetes.io/projected/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-kube-api-access-fngxv\") pod \"swift-ring-rebalance-debug-dj74j\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " 
pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:07 crc kubenswrapper[4738]: I0307 07:42:07.138845 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:07 crc kubenswrapper[4738]: I0307 07:42:07.571866 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dj74j"] Mar 07 07:42:07 crc kubenswrapper[4738]: W0307 07:42:07.573450 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58abbfc1_229e_4362_a0ac_b8e8efe9dd16.slice/crio-4bd09722702283d49c83fc907fed37eb8ba8e15b896e254ed242bfd93429f670 WatchSource:0}: Error finding container 4bd09722702283d49c83fc907fed37eb8ba8e15b896e254ed242bfd93429f670: Status 404 returned error can't find the container with id 4bd09722702283d49c83fc907fed37eb8ba8e15b896e254ed242bfd93429f670 Mar 07 07:42:08 crc kubenswrapper[4738]: I0307 07:42:08.413991 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" event={"ID":"58abbfc1-229e-4362-a0ac-b8e8efe9dd16","Type":"ContainerStarted","Data":"abdc557f8c4d57a924c279189a305c6e56fd31c68cad7d4b5d232ee4d344d17b"} Mar 07 07:42:08 crc kubenswrapper[4738]: I0307 07:42:08.414309 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" event={"ID":"58abbfc1-229e-4362-a0ac-b8e8efe9dd16","Type":"ContainerStarted","Data":"4bd09722702283d49c83fc907fed37eb8ba8e15b896e254ed242bfd93429f670"} Mar 07 07:42:08 crc kubenswrapper[4738]: I0307 07:42:08.429590 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" podStartSLOduration=2.429571685 podStartE2EDuration="2.429571685s" podCreationTimestamp="2026-03-07 07:42:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 07:42:08.426838482 +0000 UTC m=+2546.891825803" watchObservedRunningTime="2026-03-07 07:42:08.429571685 +0000 UTC m=+2546.894559016" Mar 07 07:42:09 crc kubenswrapper[4738]: I0307 07:42:09.416191 4738 generic.go:334] "Generic (PLEG): container finished" podID="58abbfc1-229e-4362-a0ac-b8e8efe9dd16" containerID="abdc557f8c4d57a924c279189a305c6e56fd31c68cad7d4b5d232ee4d344d17b" exitCode=0 Mar 07 07:42:09 crc kubenswrapper[4738]: I0307 07:42:09.416373 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" event={"ID":"58abbfc1-229e-4362-a0ac-b8e8efe9dd16","Type":"ContainerDied","Data":"abdc557f8c4d57a924c279189a305c6e56fd31c68cad7d4b5d232ee4d344d17b"} Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.386115 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:42:10 crc kubenswrapper[4738]: E0307 07:42:10.386365 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.722437 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.756350 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-ring-data-devices\") pod \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.756434 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-etc-swift\") pod \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.756465 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-dispersionconf\") pod \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.756511 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fngxv\" (UniqueName: \"kubernetes.io/projected/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-kube-api-access-fngxv\") pod \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.756538 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-swiftconf\") pod \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.756572 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-scripts\") pod \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\" (UID: \"58abbfc1-229e-4362-a0ac-b8e8efe9dd16\") " Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.756859 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "58abbfc1-229e-4362-a0ac-b8e8efe9dd16" (UID: "58abbfc1-229e-4362-a0ac-b8e8efe9dd16"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.758100 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "58abbfc1-229e-4362-a0ac-b8e8efe9dd16" (UID: "58abbfc1-229e-4362-a0ac-b8e8efe9dd16"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.761847 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dj74j"] Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.771222 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["swift-kuttl-tests/swift-ring-rebalance-debug-dj74j"] Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.772538 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-kube-api-access-fngxv" (OuterVolumeSpecName: "kube-api-access-fngxv") pod "58abbfc1-229e-4362-a0ac-b8e8efe9dd16" (UID: "58abbfc1-229e-4362-a0ac-b8e8efe9dd16"). InnerVolumeSpecName "kube-api-access-fngxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.780006 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "58abbfc1-229e-4362-a0ac-b8e8efe9dd16" (UID: "58abbfc1-229e-4362-a0ac-b8e8efe9dd16"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.780442 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-scripts" (OuterVolumeSpecName: "scripts") pod "58abbfc1-229e-4362-a0ac-b8e8efe9dd16" (UID: "58abbfc1-229e-4362-a0ac-b8e8efe9dd16"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.783137 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "58abbfc1-229e-4362-a0ac-b8e8efe9dd16" (UID: "58abbfc1-229e-4362-a0ac-b8e8efe9dd16"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.858133 4738 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.858197 4738 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.858210 4738 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.858220 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fngxv\" (UniqueName: \"kubernetes.io/projected/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-kube-api-access-fngxv\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.858230 4738 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:10 crc kubenswrapper[4738]: I0307 07:42:10.858238 4738 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/58abbfc1-229e-4362-a0ac-b8e8efe9dd16-scripts\") on node \"crc\" DevicePath \"\"" Mar 07 07:42:11 crc kubenswrapper[4738]: I0307 07:42:11.436464 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bd09722702283d49c83fc907fed37eb8ba8e15b896e254ed242bfd93429f670" Mar 07 07:42:11 crc kubenswrapper[4738]: I0307 07:42:11.436576 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="swift-kuttl-tests/swift-ring-rebalance-debug-dj74j" Mar 07 07:42:12 crc kubenswrapper[4738]: I0307 07:42:12.398135 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58abbfc1-229e-4362-a0ac-b8e8efe9dd16" path="/var/lib/kubelet/pods/58abbfc1-229e-4362-a0ac-b8e8efe9dd16/volumes" Mar 07 07:42:21 crc kubenswrapper[4738]: I0307 07:42:21.386039 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:42:21 crc kubenswrapper[4738]: E0307 07:42:21.387243 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:42:34 crc kubenswrapper[4738]: I0307 07:42:34.385204 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:42:34 crc kubenswrapper[4738]: E0307 07:42:34.385890 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:42:43 crc kubenswrapper[4738]: I0307 07:42:43.804612 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rfp5n/must-gather-7zxj8"] Mar 07 07:42:43 crc kubenswrapper[4738]: E0307 07:42:43.807016 4738 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="58abbfc1-229e-4362-a0ac-b8e8efe9dd16" containerName="swift-ring-rebalance" Mar 07 07:42:43 crc kubenswrapper[4738]: I0307 07:42:43.807050 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="58abbfc1-229e-4362-a0ac-b8e8efe9dd16" containerName="swift-ring-rebalance" Mar 07 07:42:43 crc kubenswrapper[4738]: I0307 07:42:43.807243 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="58abbfc1-229e-4362-a0ac-b8e8efe9dd16" containerName="swift-ring-rebalance" Mar 07 07:42:43 crc kubenswrapper[4738]: I0307 07:42:43.808220 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rfp5n/must-gather-7zxj8" Mar 07 07:42:43 crc kubenswrapper[4738]: I0307 07:42:43.810560 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-rfp5n"/"default-dockercfg-2gk8d" Mar 07 07:42:43 crc kubenswrapper[4738]: I0307 07:42:43.810739 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rfp5n"/"kube-root-ca.crt" Mar 07 07:42:43 crc kubenswrapper[4738]: I0307 07:42:43.812027 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rfp5n"/"openshift-service-ca.crt" Mar 07 07:42:43 crc kubenswrapper[4738]: I0307 07:42:43.880295 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rfp5n/must-gather-7zxj8"] Mar 07 07:42:43 crc kubenswrapper[4738]: I0307 07:42:43.899241 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e8ecd28-5ba0-430f-9f73-b1450163a008-must-gather-output\") pod \"must-gather-7zxj8\" (UID: \"2e8ecd28-5ba0-430f-9f73-b1450163a008\") " pod="openshift-must-gather-rfp5n/must-gather-7zxj8" Mar 07 07:42:43 crc kubenswrapper[4738]: I0307 07:42:43.899387 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rx67h\" (UniqueName: \"kubernetes.io/projected/2e8ecd28-5ba0-430f-9f73-b1450163a008-kube-api-access-rx67h\") pod \"must-gather-7zxj8\" (UID: \"2e8ecd28-5ba0-430f-9f73-b1450163a008\") " pod="openshift-must-gather-rfp5n/must-gather-7zxj8" Mar 07 07:42:44 crc kubenswrapper[4738]: I0307 07:42:44.000379 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rx67h\" (UniqueName: \"kubernetes.io/projected/2e8ecd28-5ba0-430f-9f73-b1450163a008-kube-api-access-rx67h\") pod \"must-gather-7zxj8\" (UID: \"2e8ecd28-5ba0-430f-9f73-b1450163a008\") " pod="openshift-must-gather-rfp5n/must-gather-7zxj8" Mar 07 07:42:44 crc kubenswrapper[4738]: I0307 07:42:44.000474 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e8ecd28-5ba0-430f-9f73-b1450163a008-must-gather-output\") pod \"must-gather-7zxj8\" (UID: \"2e8ecd28-5ba0-430f-9f73-b1450163a008\") " pod="openshift-must-gather-rfp5n/must-gather-7zxj8" Mar 07 07:42:44 crc kubenswrapper[4738]: I0307 07:42:44.002528 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e8ecd28-5ba0-430f-9f73-b1450163a008-must-gather-output\") pod \"must-gather-7zxj8\" (UID: \"2e8ecd28-5ba0-430f-9f73-b1450163a008\") " pod="openshift-must-gather-rfp5n/must-gather-7zxj8" Mar 07 07:42:44 crc kubenswrapper[4738]: I0307 07:42:44.026447 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rx67h\" (UniqueName: \"kubernetes.io/projected/2e8ecd28-5ba0-430f-9f73-b1450163a008-kube-api-access-rx67h\") pod \"must-gather-7zxj8\" (UID: \"2e8ecd28-5ba0-430f-9f73-b1450163a008\") " pod="openshift-must-gather-rfp5n/must-gather-7zxj8" Mar 07 07:42:44 crc kubenswrapper[4738]: I0307 07:42:44.130023 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rfp5n/must-gather-7zxj8" Mar 07 07:42:44 crc kubenswrapper[4738]: I0307 07:42:44.571297 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rfp5n/must-gather-7zxj8"] Mar 07 07:42:44 crc kubenswrapper[4738]: I0307 07:42:44.769980 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rfp5n/must-gather-7zxj8" event={"ID":"2e8ecd28-5ba0-430f-9f73-b1450163a008","Type":"ContainerStarted","Data":"7768f57c6afd4c455768c1275f22f4b5a7e3fc35176f081a1aae1f8f57bba901"} Mar 07 07:42:45 crc kubenswrapper[4738]: I0307 07:42:45.386330 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:42:45 crc kubenswrapper[4738]: E0307 07:42:45.386700 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:42:48 crc kubenswrapper[4738]: I0307 07:42:48.627546 4738 scope.go:117] "RemoveContainer" containerID="4e236a0f9a3254972c2324ba0316af7a8f3f2d67a9f186dfa09700fe29c29c46" Mar 07 07:42:52 crc kubenswrapper[4738]: I0307 07:42:52.266972 4738 scope.go:117] "RemoveContainer" containerID="91c1fdf9fdf78e2c489b8b769f3bf378e60fb6a872f67d43d9fc4f7405cc9756" Mar 07 07:42:52 crc kubenswrapper[4738]: I0307 07:42:52.361275 4738 scope.go:117] "RemoveContainer" containerID="e7b1ff02080c8489ff785e1ab9c168a5705a38b13bf7679b835d07fcafff69f4" Mar 07 07:42:52 crc kubenswrapper[4738]: I0307 07:42:52.392498 4738 scope.go:117] "RemoveContainer" containerID="96396f0f7143c90fc689a396468e6743f8cdeac5e8a2ded6b6d45420ea48a766" Mar 07 07:42:52 crc kubenswrapper[4738]: 
I0307 07:42:52.457937 4738 scope.go:117] "RemoveContainer" containerID="4e355a8f5aa677f793af95b57852a95a785489978e5fb65aaa821790532c8429" Mar 07 07:42:52 crc kubenswrapper[4738]: I0307 07:42:52.485406 4738 scope.go:117] "RemoveContainer" containerID="a396130de5bdade456323f174fc65b9350f0dc57450b18af6974f1cc71c2d222" Mar 07 07:42:52 crc kubenswrapper[4738]: I0307 07:42:52.524766 4738 scope.go:117] "RemoveContainer" containerID="02f982255b9dc309be78b72406d5192924fd1c2aa8dce3e2b2ed17c68eb7e777" Mar 07 07:42:52 crc kubenswrapper[4738]: I0307 07:42:52.562249 4738 scope.go:117] "RemoveContainer" containerID="67d2be20671c313b66cc7a0bfcd324ba91a11ed5a0980040772de4ba3b34d5bf" Mar 07 07:42:52 crc kubenswrapper[4738]: I0307 07:42:52.625416 4738 scope.go:117] "RemoveContainer" containerID="637f6298b77264298786b4519fc2e83a968f3ac127127a960847f9d3e4cbee71" Mar 07 07:42:52 crc kubenswrapper[4738]: I0307 07:42:52.874707 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rfp5n/must-gather-7zxj8" event={"ID":"2e8ecd28-5ba0-430f-9f73-b1450163a008","Type":"ContainerStarted","Data":"754e750ffc2dd6feca5a17f1f0c64acd3a1d2acd2c23b31777d017d2b908b2a1"} Mar 07 07:42:53 crc kubenswrapper[4738]: I0307 07:42:53.886198 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rfp5n/must-gather-7zxj8" event={"ID":"2e8ecd28-5ba0-430f-9f73-b1450163a008","Type":"ContainerStarted","Data":"8bd2f809e8e03406435aef75bab7ef1d2f31f2bf40fa619b90690305722342b6"} Mar 07 07:42:53 crc kubenswrapper[4738]: I0307 07:42:53.902497 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rfp5n/must-gather-7zxj8" podStartSLOduration=3.094473957 podStartE2EDuration="10.902467516s" podCreationTimestamp="2026-03-07 07:42:43 +0000 UTC" firstStartedPulling="2026-03-07 07:42:44.585104385 +0000 UTC m=+2583.050091706" lastFinishedPulling="2026-03-07 07:42:52.393097934 +0000 UTC m=+2590.858085265" observedRunningTime="2026-03-07 
07:42:53.900061412 +0000 UTC m=+2592.365048753" watchObservedRunningTime="2026-03-07 07:42:53.902467516 +0000 UTC m=+2592.367454867" Mar 07 07:43:00 crc kubenswrapper[4738]: I0307 07:43:00.385307 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:43:00 crc kubenswrapper[4738]: E0307 07:43:00.385963 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:43:13 crc kubenswrapper[4738]: I0307 07:43:13.385830 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:43:13 crc kubenswrapper[4738]: E0307 07:43:13.386779 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:43:25 crc kubenswrapper[4738]: I0307 07:43:25.386140 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:43:25 crc kubenswrapper[4738]: E0307 07:43:25.387021 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:43:34 crc kubenswrapper[4738]: I0307 07:43:34.244130 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6_b77797b2-b93a-4d76-9191-f334fde745fa/util/0.log" Mar 07 07:43:34 crc kubenswrapper[4738]: I0307 07:43:34.437509 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6_b77797b2-b93a-4d76-9191-f334fde745fa/pull/0.log" Mar 07 07:43:34 crc kubenswrapper[4738]: I0307 07:43:34.437629 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6_b77797b2-b93a-4d76-9191-f334fde745fa/util/0.log" Mar 07 07:43:34 crc kubenswrapper[4738]: I0307 07:43:34.465231 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6_b77797b2-b93a-4d76-9191-f334fde745fa/pull/0.log" Mar 07 07:43:34 crc kubenswrapper[4738]: I0307 07:43:34.592392 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6_b77797b2-b93a-4d76-9191-f334fde745fa/util/0.log" Mar 07 07:43:34 crc kubenswrapper[4738]: I0307 07:43:34.627229 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6_b77797b2-b93a-4d76-9191-f334fde745fa/extract/0.log" Mar 07 07:43:34 crc kubenswrapper[4738]: I0307 07:43:34.666347 4738 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_449fb6470c1912a82e78329a391f5fcf195fc98f0db032e4e564b36c62pqff6_b77797b2-b93a-4d76-9191-f334fde745fa/pull/0.log" Mar 07 07:43:34 crc kubenswrapper[4738]: I0307 07:43:34.754323 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc_c2b59545-69a9-477d-857c-2134d53edc2d/util/0.log" Mar 07 07:43:34 crc kubenswrapper[4738]: I0307 07:43:34.971885 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc_c2b59545-69a9-477d-857c-2134d53edc2d/util/0.log" Mar 07 07:43:34 crc kubenswrapper[4738]: I0307 07:43:34.974111 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc_c2b59545-69a9-477d-857c-2134d53edc2d/pull/0.log" Mar 07 07:43:34 crc kubenswrapper[4738]: I0307 07:43:34.982475 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc_c2b59545-69a9-477d-857c-2134d53edc2d/pull/0.log" Mar 07 07:43:35 crc kubenswrapper[4738]: I0307 07:43:35.159905 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc_c2b59545-69a9-477d-857c-2134d53edc2d/util/0.log" Mar 07 07:43:35 crc kubenswrapper[4738]: I0307 07:43:35.174936 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc_c2b59545-69a9-477d-857c-2134d53edc2d/extract/0.log" Mar 07 07:43:35 crc kubenswrapper[4738]: I0307 07:43:35.175003 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_7df1acda6d491701507c29099a5c44d8584fa8400ab5bcb9bc51d65ba87xntc_c2b59545-69a9-477d-857c-2134d53edc2d/pull/0.log" Mar 07 07:43:35 crc 
kubenswrapper[4738]: I0307 07:43:35.309870 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq_32694363-7ac1-464f-a71a-142236e5eab8/util/0.log" Mar 07 07:43:35 crc kubenswrapper[4738]: I0307 07:43:35.454406 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq_32694363-7ac1-464f-a71a-142236e5eab8/util/0.log" Mar 07 07:43:35 crc kubenswrapper[4738]: I0307 07:43:35.469247 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq_32694363-7ac1-464f-a71a-142236e5eab8/pull/0.log" Mar 07 07:43:35 crc kubenswrapper[4738]: I0307 07:43:35.512037 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq_32694363-7ac1-464f-a71a-142236e5eab8/pull/0.log" Mar 07 07:43:35 crc kubenswrapper[4738]: I0307 07:43:35.654147 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq_32694363-7ac1-464f-a71a-142236e5eab8/util/0.log" Mar 07 07:43:35 crc kubenswrapper[4738]: I0307 07:43:35.675856 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq_32694363-7ac1-464f-a71a-142236e5eab8/extract/0.log" Mar 07 07:43:35 crc kubenswrapper[4738]: I0307 07:43:35.704695 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e590jb4xq_32694363-7ac1-464f-a71a-142236e5eab8/pull/0.log" Mar 07 07:43:35 crc kubenswrapper[4738]: I0307 07:43:35.828993 4738 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm_51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef/util/0.log" Mar 07 07:43:35 crc kubenswrapper[4738]: I0307 07:43:35.994900 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm_51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef/util/0.log" Mar 07 07:43:35 crc kubenswrapper[4738]: I0307 07:43:35.997670 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm_51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef/pull/0.log" Mar 07 07:43:36 crc kubenswrapper[4738]: I0307 07:43:36.015272 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm_51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef/pull/0.log" Mar 07 07:43:36 crc kubenswrapper[4738]: I0307 07:43:36.192740 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm_51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef/extract/0.log" Mar 07 07:43:36 crc kubenswrapper[4738]: I0307 07:43:36.230458 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm_51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef/util/0.log" Mar 07 07:43:36 crc kubenswrapper[4738]: I0307 07:43:36.232521 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a8831b5415c61d2070187f89bbd36b8acd1ec1f7bfd7a0222b24d4bc40k45pm_51f3d627-e5bd-4ffe-8b54-79c4fd8a86ef/pull/0.log" Mar 07 07:43:36 crc kubenswrapper[4738]: I0307 07:43:36.385334 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:43:36 crc kubenswrapper[4738]: E0307 07:43:36.385730 4738 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:43:36 crc kubenswrapper[4738]: I0307 07:43:36.403500 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-index-76wqn_3b87b2f5-b8bf-451f-907e-c47b1c197381/registry-server/0.log" Mar 07 07:43:36 crc kubenswrapper[4738]: I0307 07:43:36.572430 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6_9ab2f06d-a208-4b05-a799-10ff718be96f/util/0.log" Mar 07 07:43:36 crc kubenswrapper[4738]: I0307 07:43:36.794483 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6_9ab2f06d-a208-4b05-a799-10ff718be96f/pull/0.log" Mar 07 07:43:36 crc kubenswrapper[4738]: I0307 07:43:36.826982 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6_9ab2f06d-a208-4b05-a799-10ff718be96f/pull/0.log" Mar 07 07:43:36 crc kubenswrapper[4738]: I0307 07:43:36.865673 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6_9ab2f06d-a208-4b05-a799-10ff718be96f/util/0.log" Mar 07 07:43:37 crc kubenswrapper[4738]: I0307 07:43:37.005174 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6_9ab2f06d-a208-4b05-a799-10ff718be96f/util/0.log" Mar 07 07:43:37 crc kubenswrapper[4738]: I0307 07:43:37.072259 4738 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6_9ab2f06d-a208-4b05-a799-10ff718be96f/extract/0.log" Mar 07 07:43:37 crc kubenswrapper[4738]: I0307 07:43:37.083881 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5c8d1396f902151618bcfa62374f22efa6e83c406f0e9e69306ef85039vzp6_9ab2f06d-a208-4b05-a799-10ff718be96f/pull/0.log" Mar 07 07:43:37 crc kubenswrapper[4738]: I0307 07:43:37.231647 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28_ff8317ef-e259-4059-b434-33fdfd265c20/util/0.log" Mar 07 07:43:37 crc kubenswrapper[4738]: I0307 07:43:37.601411 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28_ff8317ef-e259-4059-b434-33fdfd265c20/pull/0.log" Mar 07 07:43:37 crc kubenswrapper[4738]: I0307 07:43:37.663835 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28_ff8317ef-e259-4059-b434-33fdfd265c20/util/0.log" Mar 07 07:43:37 crc kubenswrapper[4738]: I0307 07:43:37.687113 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28_ff8317ef-e259-4059-b434-33fdfd265c20/pull/0.log" Mar 07 07:43:37 crc kubenswrapper[4738]: I0307 07:43:37.893437 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28_ff8317ef-e259-4059-b434-33fdfd265c20/util/0.log" Mar 07 07:43:37 crc kubenswrapper[4738]: I0307 07:43:37.920503 4738 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28_ff8317ef-e259-4059-b434-33fdfd265c20/pull/0.log" Mar 07 07:43:37 crc kubenswrapper[4738]: I0307 07:43:37.977005 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_c5dd21a97976f00f72ca6d3429b14fbdb328dc08cfd92939eb7c42ec8cz7w28_ff8317ef-e259-4059-b434-33fdfd265c20/extract/0.log" Mar 07 07:43:38 crc kubenswrapper[4738]: I0307 07:43:38.087287 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-676c9dd8fb-jzgbt_22276e47-e857-4b2f-a2ef-0976a50c1294/manager/0.log" Mar 07 07:43:38 crc kubenswrapper[4738]: I0307 07:43:38.177445 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-index-vmk7s_052550e0-03bd-4c62-8393-b43abaafb198/registry-server/0.log" Mar 07 07:43:38 crc kubenswrapper[4738]: I0307 07:43:38.314774 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-67bb98769b-bs7fw_02e16c67-848e-4269-bb82-09ace460b9fe/manager/0.log" Mar 07 07:43:38 crc kubenswrapper[4738]: I0307 07:43:38.434271 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-index-qg2d2_1d4db5b1-0bd5-45dd-a926-40d569a6eeb5/registry-server/0.log" Mar 07 07:43:38 crc kubenswrapper[4738]: I0307 07:43:38.632026 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-7db7f5474b-5qcsw_157f5b45-2eb2-4da7-bea9-722b108fa769/manager/0.log" Mar 07 07:43:38 crc kubenswrapper[4738]: I0307 07:43:38.676244 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-index-qzgzb_c6ada68d-3a9c-41c5-a8a4-1abe8cf757b9/registry-server/0.log" Mar 07 07:43:38 crc kubenswrapper[4738]: I0307 07:43:38.765020 4738 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-779fc9694b-nbhcm_fb842fb6-8fb7-4601-a0be-bab337f47d4a/operator/0.log" Mar 07 07:43:38 crc kubenswrapper[4738]: I0307 07:43:38.962648 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-index-xmcn5_5bad6865-faaa-4032-b282-4d5a699bd4e2/registry-server/0.log" Mar 07 07:43:39 crc kubenswrapper[4738]: I0307 07:43:39.100172 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f949bdfdb-rrqv4_86f1965b-c0fd-4540-909f-301789d064af/manager/0.log" Mar 07 07:43:39 crc kubenswrapper[4738]: I0307 07:43:39.127807 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-648b4464d8-k2jvm_b5c35317-4298-4385-b73f-e73d48fb756e/manager/0.log" Mar 07 07:43:39 crc kubenswrapper[4738]: I0307 07:43:39.156390 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-index-kq2lh_5a80278c-d06b-48e6-8ed1-129980ae7ba3/registry-server/0.log" Mar 07 07:43:48 crc kubenswrapper[4738]: I0307 07:43:48.386080 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:43:48 crc kubenswrapper[4738]: E0307 07:43:48.387121 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:43:52 crc kubenswrapper[4738]: I0307 07:43:52.807928 4738 scope.go:117] "RemoveContainer" containerID="729d3c4fce71dd57fa14a7fb3ea42ec97e47fd51ed82a4eef3c941bddec9d9fa" Mar 07 07:43:52 crc 
kubenswrapper[4738]: I0307 07:43:52.835895 4738 scope.go:117] "RemoveContainer" containerID="939e1a1a1a6c45e6bf61b0123fb97be744abe24c28700c8b58ade411cbe7e427" Mar 07 07:43:52 crc kubenswrapper[4738]: I0307 07:43:52.893124 4738 scope.go:117] "RemoveContainer" containerID="5d6ad447966589bd7c4379476fa982d1b7d5e58c5c678cf6236fee0c8eb8b5fe" Mar 07 07:43:52 crc kubenswrapper[4738]: I0307 07:43:52.920581 4738 scope.go:117] "RemoveContainer" containerID="cee3ce6883eddcd166e4b294be70391336e58febb18ad07a6bd8b0bc7253db5a" Mar 07 07:43:52 crc kubenswrapper[4738]: I0307 07:43:52.951725 4738 scope.go:117] "RemoveContainer" containerID="561f066cbf67bfe70ed5681bc18474871b252207a764fa4fd80eec35aab167ca" Mar 07 07:43:52 crc kubenswrapper[4738]: I0307 07:43:52.985002 4738 scope.go:117] "RemoveContainer" containerID="2db3f6488469b75ebc303eb6dd98c372bb59c32f16936046fbc8e6066e3203f1" Mar 07 07:43:53 crc kubenswrapper[4738]: I0307 07:43:53.021945 4738 scope.go:117] "RemoveContainer" containerID="4f7832d62a82709b88926b7516b6cc2660470adbc0a1edcc9db6c1c66d22a664" Mar 07 07:43:53 crc kubenswrapper[4738]: I0307 07:43:53.048610 4738 scope.go:117] "RemoveContainer" containerID="9f950ef790d657780bf45e4e9a8892c41c1ad1ba0acc788dced915075bf9072d" Mar 07 07:43:53 crc kubenswrapper[4738]: I0307 07:43:53.066696 4738 scope.go:117] "RemoveContainer" containerID="d8032b66c77f34fa43304d23968c8c4872fafc507fdede388f6ee8713a5e983d" Mar 07 07:43:53 crc kubenswrapper[4738]: I0307 07:43:53.097221 4738 scope.go:117] "RemoveContainer" containerID="9ccf16f1c9927b36bda9e666d4367ef544cafd2b34b5d9b7be8c3a1fa4c90b4d" Mar 07 07:43:54 crc kubenswrapper[4738]: I0307 07:43:54.020950 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-f9m4m_6b20a433-e331-4429-9cc9-041080de016b/control-plane-machine-set-operator/0.log" Mar 07 07:43:54 crc kubenswrapper[4738]: I0307 07:43:54.199529 4738 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wjdr5_de384384-33aa-408f-a998-bf6b3e79e0a8/kube-rbac-proxy/0.log" Mar 07 07:43:54 crc kubenswrapper[4738]: I0307 07:43:54.225143 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wjdr5_de384384-33aa-408f-a998-bf6b3e79e0a8/machine-api-operator/0.log" Mar 07 07:44:00 crc kubenswrapper[4738]: I0307 07:44:00.146447 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547824-kkjb7"] Mar 07 07:44:00 crc kubenswrapper[4738]: I0307 07:44:00.148245 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547824-kkjb7" Mar 07 07:44:00 crc kubenswrapper[4738]: I0307 07:44:00.150061 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:44:00 crc kubenswrapper[4738]: I0307 07:44:00.151240 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:44:00 crc kubenswrapper[4738]: I0307 07:44:00.152047 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:44:00 crc kubenswrapper[4738]: I0307 07:44:00.160815 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547824-kkjb7"] Mar 07 07:44:00 crc kubenswrapper[4738]: I0307 07:44:00.241545 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56vdd\" (UniqueName: \"kubernetes.io/projected/22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa-kube-api-access-56vdd\") pod \"auto-csr-approver-29547824-kkjb7\" (UID: \"22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa\") " pod="openshift-infra/auto-csr-approver-29547824-kkjb7" Mar 07 07:44:00 crc kubenswrapper[4738]: I0307 07:44:00.343372 4738 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-56vdd\" (UniqueName: \"kubernetes.io/projected/22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa-kube-api-access-56vdd\") pod \"auto-csr-approver-29547824-kkjb7\" (UID: \"22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa\") " pod="openshift-infra/auto-csr-approver-29547824-kkjb7" Mar 07 07:44:00 crc kubenswrapper[4738]: I0307 07:44:00.373274 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56vdd\" (UniqueName: \"kubernetes.io/projected/22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa-kube-api-access-56vdd\") pod \"auto-csr-approver-29547824-kkjb7\" (UID: \"22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa\") " pod="openshift-infra/auto-csr-approver-29547824-kkjb7" Mar 07 07:44:00 crc kubenswrapper[4738]: I0307 07:44:00.464306 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547824-kkjb7" Mar 07 07:44:00 crc kubenswrapper[4738]: I0307 07:44:00.887073 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547824-kkjb7"] Mar 07 07:44:01 crc kubenswrapper[4738]: I0307 07:44:01.385143 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:44:01 crc kubenswrapper[4738]: E0307 07:44:01.385428 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:44:01 crc kubenswrapper[4738]: I0307 07:44:01.537574 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547824-kkjb7" 
event={"ID":"22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa","Type":"ContainerStarted","Data":"648c62e15aaeaee316281d68da1b8ef0ddce035ad4a59d24285ada0ee2a85ea1"} Mar 07 07:44:02 crc kubenswrapper[4738]: I0307 07:44:02.545342 4738 generic.go:334] "Generic (PLEG): container finished" podID="22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa" containerID="f4800bb489ba2fd073e2e345b84b82822f470703d65ee021831a67a03d6914e5" exitCode=0 Mar 07 07:44:02 crc kubenswrapper[4738]: I0307 07:44:02.545414 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547824-kkjb7" event={"ID":"22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa","Type":"ContainerDied","Data":"f4800bb489ba2fd073e2e345b84b82822f470703d65ee021831a67a03d6914e5"} Mar 07 07:44:03 crc kubenswrapper[4738]: I0307 07:44:03.829342 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547824-kkjb7" Mar 07 07:44:03 crc kubenswrapper[4738]: I0307 07:44:03.895459 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56vdd\" (UniqueName: \"kubernetes.io/projected/22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa-kube-api-access-56vdd\") pod \"22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa\" (UID: \"22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa\") " Mar 07 07:44:03 crc kubenswrapper[4738]: I0307 07:44:03.903544 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa-kube-api-access-56vdd" (OuterVolumeSpecName: "kube-api-access-56vdd") pod "22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa" (UID: "22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa"). InnerVolumeSpecName "kube-api-access-56vdd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:44:03 crc kubenswrapper[4738]: I0307 07:44:03.997899 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56vdd\" (UniqueName: \"kubernetes.io/projected/22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa-kube-api-access-56vdd\") on node \"crc\" DevicePath \"\"" Mar 07 07:44:04 crc kubenswrapper[4738]: I0307 07:44:04.565115 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547824-kkjb7" event={"ID":"22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa","Type":"ContainerDied","Data":"648c62e15aaeaee316281d68da1b8ef0ddce035ad4a59d24285ada0ee2a85ea1"} Mar 07 07:44:04 crc kubenswrapper[4738]: I0307 07:44:04.565175 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547824-kkjb7" Mar 07 07:44:04 crc kubenswrapper[4738]: I0307 07:44:04.565184 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="648c62e15aaeaee316281d68da1b8ef0ddce035ad4a59d24285ada0ee2a85ea1" Mar 07 07:44:04 crc kubenswrapper[4738]: I0307 07:44:04.907142 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547818-mvsbs"] Mar 07 07:44:04 crc kubenswrapper[4738]: I0307 07:44:04.913587 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547818-mvsbs"] Mar 07 07:44:06 crc kubenswrapper[4738]: I0307 07:44:06.403980 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da05917f-8b20-4841-b237-1a0fe85e93e1" path="/var/lib/kubelet/pods/da05917f-8b20-4841-b237-1a0fe85e93e1/volumes" Mar 07 07:44:14 crc kubenswrapper[4738]: I0307 07:44:14.385447 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:44:14 crc kubenswrapper[4738]: E0307 07:44:14.386086 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:44:22 crc kubenswrapper[4738]: I0307 07:44:22.662654 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-rtspq_59fb62c4-aece-4eae-99d3-92872dc1c4de/kube-rbac-proxy/0.log" Mar 07 07:44:22 crc kubenswrapper[4738]: I0307 07:44:22.708959 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-86ddb6bd46-rtspq_59fb62c4-aece-4eae-99d3-92872dc1c4de/controller/0.log" Mar 07 07:44:22 crc kubenswrapper[4738]: I0307 07:44:22.838632 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/cp-frr-files/0.log" Mar 07 07:44:22 crc kubenswrapper[4738]: I0307 07:44:22.978653 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/cp-frr-files/0.log" Mar 07 07:44:22 crc kubenswrapper[4738]: I0307 07:44:22.991555 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/cp-reloader/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.009633 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/cp-reloader/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.015045 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/cp-metrics/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.174353 4738 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/cp-frr-files/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.204831 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/cp-reloader/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.210594 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/cp-metrics/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.229188 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/cp-metrics/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.377358 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/cp-frr-files/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.388599 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/cp-metrics/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.394807 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/controller/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.397144 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/cp-reloader/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.528667 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/frr-metrics/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.586173 4738 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/kube-rbac-proxy/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.590566 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/kube-rbac-proxy-frr/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.735179 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/reloader/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.846026 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7f989f654f-2dvtg_0623b7d4-baca-4ea1-a095-551939ddc05f/frr-k8s-webhook-server/0.log" Mar 07 07:44:23 crc kubenswrapper[4738]: I0307 07:44:23.961749 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5b66fc78fd-rzn97_756069cb-b586-444a-9f5a-48b3316677c2/manager/0.log" Mar 07 07:44:24 crc kubenswrapper[4738]: I0307 07:44:24.172942 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-78584b5797-gwhrq_e109b736-b857-43fd-821e-7f6f71490c99/webhook-server/0.log" Mar 07 07:44:24 crc kubenswrapper[4738]: I0307 07:44:24.257412 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vpfpk_161f12dc-536f-41ca-89b8-4feca285a795/kube-rbac-proxy/0.log" Mar 07 07:44:24 crc kubenswrapper[4738]: I0307 07:44:24.457966 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vpfpk_161f12dc-536f-41ca-89b8-4feca285a795/speaker/0.log" Mar 07 07:44:24 crc kubenswrapper[4738]: I0307 07:44:24.972572 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-n9czq_69c55926-37e2-41f4-aaac-8d6aeacbfb47/frr/0.log" Mar 07 07:44:29 crc kubenswrapper[4738]: I0307 07:44:29.385378 4738 scope.go:117] 
"RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:44:29 crc kubenswrapper[4738]: E0307 07:44:29.386035 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:44:38 crc kubenswrapper[4738]: I0307 07:44:38.112592 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-api-57ff6dd9fd-sd8jl_85f7ad24-4e32-4833-b113-2f53c5fe23c2/barbican-api/0.log" Mar 07 07:44:38 crc kubenswrapper[4738]: I0307 07:44:38.207336 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-api-57ff6dd9fd-sd8jl_85f7ad24-4e32-4833-b113-2f53c5fe23c2/barbican-api-log/0.log" Mar 07 07:44:38 crc kubenswrapper[4738]: I0307 07:44:38.256418 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-db-sync-9lcxq_3032dbb3-8ffe-4712-a9b5-a57f22eebe73/barbican-db-sync/0.log" Mar 07 07:44:38 crc kubenswrapper[4738]: I0307 07:44:38.390712 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-keystone-listener-7cddffb874-sgxs6_ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e/barbican-keystone-listener/0.log" Mar 07 07:44:38 crc kubenswrapper[4738]: I0307 07:44:38.430580 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-keystone-listener-7cddffb874-sgxs6_ee6a6f2f-a684-4ee3-8cb5-9405fa4f034e/barbican-keystone-listener-log/0.log" Mar 07 07:44:38 crc kubenswrapper[4738]: I0307 07:44:38.494807 4738 log.go:25] "Finished parsing log file" 
path="/var/log/pods/swift-kuttl-tests_barbican-worker-76fbcdbdf7-hhpvj_31dd356b-08d4-49ca-81c6-37f7f89cc581/barbican-worker/0.log" Mar 07 07:44:38 crc kubenswrapper[4738]: I0307 07:44:38.575471 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_barbican-worker-76fbcdbdf7-hhpvj_31dd356b-08d4-49ca-81c6-37f7f89cc581/barbican-worker-log/0.log" Mar 07 07:44:38 crc kubenswrapper[4738]: I0307 07:44:38.931743 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-0_a9054324-a882-4b95-adda-bde2ea2ab268/mysql-bootstrap/0.log" Mar 07 07:44:39 crc kubenswrapper[4738]: I0307 07:44:39.061570 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_keystone-686f9bb899-4llf4_fc363915-16ef-41ad-9d4a-542abfe49889/keystone-api/0.log" Mar 07 07:44:39 crc kubenswrapper[4738]: I0307 07:44:39.119473 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-0_a9054324-a882-4b95-adda-bde2ea2ab268/mysql-bootstrap/0.log" Mar 07 07:44:39 crc kubenswrapper[4738]: I0307 07:44:39.167293 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-0_a9054324-a882-4b95-adda-bde2ea2ab268/galera/0.log" Mar 07 07:44:39 crc kubenswrapper[4738]: I0307 07:44:39.343549 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-1_acf747a6-cf73-4b69-a1b7-bf95cde29f63/mysql-bootstrap/0.log" Mar 07 07:44:39 crc kubenswrapper[4738]: I0307 07:44:39.430905 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-1_acf747a6-cf73-4b69-a1b7-bf95cde29f63/mysql-bootstrap/0.log" Mar 07 07:44:39 crc kubenswrapper[4738]: I0307 07:44:39.488485 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-1_acf747a6-cf73-4b69-a1b7-bf95cde29f63/galera/0.log" Mar 07 07:44:39 crc kubenswrapper[4738]: I0307 07:44:39.629603 4738 
log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-2_178ed054-c2f4-450d-99b8-88fd079432cf/mysql-bootstrap/0.log" Mar 07 07:44:39 crc kubenswrapper[4738]: I0307 07:44:39.884907 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-2_178ed054-c2f4-450d-99b8-88fd079432cf/galera/0.log" Mar 07 07:44:39 crc kubenswrapper[4738]: I0307 07:44:39.912552 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_openstack-galera-2_178ed054-c2f4-450d-99b8-88fd079432cf/mysql-bootstrap/0.log" Mar 07 07:44:40 crc kubenswrapper[4738]: I0307 07:44:40.070521 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_rabbitmq-server-0_3f61d3e1-a31b-4e55-8826-0613669b010c/setup-container/0.log" Mar 07 07:44:40 crc kubenswrapper[4738]: I0307 07:44:40.317544 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_rabbitmq-server-0_3f61d3e1-a31b-4e55-8826-0613669b010c/rabbitmq/0.log" Mar 07 07:44:40 crc kubenswrapper[4738]: I0307 07:44:40.346314 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_rabbitmq-server-0_3f61d3e1-a31b-4e55-8826-0613669b010c/setup-container/0.log" Mar 07 07:44:40 crc kubenswrapper[4738]: I0307 07:44:40.548774 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-proxy-76c998454c-rj46d_128745d7-e885-48bf-88fe-22bacab590d5/proxy-httpd/0.log" Mar 07 07:44:40 crc kubenswrapper[4738]: I0307 07:44:40.599549 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-proxy-76c998454c-rj46d_128745d7-e885-48bf-88fe-22bacab590d5/proxy-server/0.log" Mar 07 07:44:40 crc kubenswrapper[4738]: I0307 07:44:40.799087 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-ring-rebalance-8czg5_8f789001-3731-4f60-8a8a-8a41cb5a8ee8/swift-ring-rebalance/0.log" Mar 07 07:44:40 crc kubenswrapper[4738]: 
I0307 07:44:40.832652 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_4cca6511-1923-4d3b-9354-5f35fd664d64/account-auditor/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.011339 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_4cca6511-1923-4d3b-9354-5f35fd664d64/account-reaper/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.072047 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_memcached-0_18680f34-6448-42b6-bb6d-c87e7f1becb7/memcached/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.074051 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_4cca6511-1923-4d3b-9354-5f35fd664d64/account-server/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.074137 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_4cca6511-1923-4d3b-9354-5f35fd664d64/account-replicator/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.189345 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_4cca6511-1923-4d3b-9354-5f35fd664d64/container-auditor/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.240251 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_4cca6511-1923-4d3b-9354-5f35fd664d64/container-replicator/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.241585 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_4cca6511-1923-4d3b-9354-5f35fd664d64/container-updater/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.252018 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_4cca6511-1923-4d3b-9354-5f35fd664d64/container-server/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 
07:44:41.350884 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_4cca6511-1923-4d3b-9354-5f35fd664d64/object-auditor/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.396186 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_4cca6511-1923-4d3b-9354-5f35fd664d64/object-server/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.414568 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_4cca6511-1923-4d3b-9354-5f35fd664d64/object-replicator/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.415483 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_4cca6511-1923-4d3b-9354-5f35fd664d64/object-expirer/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.516574 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_4cca6511-1923-4d3b-9354-5f35fd664d64/object-updater/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.563880 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_4cca6511-1923-4d3b-9354-5f35fd664d64/rsync/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.582010 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-0_4cca6511-1923-4d3b-9354-5f35fd664d64/swift-recon-cron/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.699768 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_04bf7f33-d023-4bab-be79-70348bf80391/account-auditor/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.726955 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_04bf7f33-d023-4bab-be79-70348bf80391/account-reaper/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.749450 4738 
log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_04bf7f33-d023-4bab-be79-70348bf80391/account-replicator/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.774963 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_04bf7f33-d023-4bab-be79-70348bf80391/account-server/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.872507 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_04bf7f33-d023-4bab-be79-70348bf80391/container-auditor/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.908617 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_04bf7f33-d023-4bab-be79-70348bf80391/container-server/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.975608 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_04bf7f33-d023-4bab-be79-70348bf80391/container-replicator/0.log" Mar 07 07:44:41 crc kubenswrapper[4738]: I0307 07:44:41.982151 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_04bf7f33-d023-4bab-be79-70348bf80391/container-updater/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.072172 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_04bf7f33-d023-4bab-be79-70348bf80391/object-expirer/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.084376 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_04bf7f33-d023-4bab-be79-70348bf80391/object-auditor/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.147682 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_04bf7f33-d023-4bab-be79-70348bf80391/object-server/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.180720 4738 
log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_04bf7f33-d023-4bab-be79-70348bf80391/object-replicator/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.225678 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_04bf7f33-d023-4bab-be79-70348bf80391/object-updater/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.257792 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_04bf7f33-d023-4bab-be79-70348bf80391/rsync/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.264084 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-1_04bf7f33-d023-4bab-be79-70348bf80391/swift-recon-cron/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.385016 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_b8c4676c-c7a7-46e8-95d0-ec6656e32fe1/account-reaper/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.406937 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_b8c4676c-c7a7-46e8-95d0-ec6656e32fe1/account-auditor/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.479059 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_b8c4676c-c7a7-46e8-95d0-ec6656e32fe1/account-replicator/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.494776 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_b8c4676c-c7a7-46e8-95d0-ec6656e32fe1/account-server/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.528623 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_b8c4676c-c7a7-46e8-95d0-ec6656e32fe1/container-auditor/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.574151 4738 log.go:25] 
"Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_b8c4676c-c7a7-46e8-95d0-ec6656e32fe1/container-server/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.586912 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_b8c4676c-c7a7-46e8-95d0-ec6656e32fe1/container-replicator/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.654853 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_b8c4676c-c7a7-46e8-95d0-ec6656e32fe1/container-updater/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.676815 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_b8c4676c-c7a7-46e8-95d0-ec6656e32fe1/object-auditor/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.688072 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_b8c4676c-c7a7-46e8-95d0-ec6656e32fe1/object-expirer/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.735576 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_b8c4676c-c7a7-46e8-95d0-ec6656e32fe1/object-replicator/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.765482 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_b8c4676c-c7a7-46e8-95d0-ec6656e32fe1/object-server/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.834176 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_b8c4676c-c7a7-46e8-95d0-ec6656e32fe1/rsync/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.835355 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_b8c4676c-c7a7-46e8-95d0-ec6656e32fe1/object-updater/0.log" Mar 07 07:44:42 crc kubenswrapper[4738]: I0307 07:44:42.844662 4738 log.go:25] "Finished 
parsing log file" path="/var/log/pods/swift-kuttl-tests_swift-storage-2_b8c4676c-c7a7-46e8-95d0-ec6656e32fe1/swift-recon-cron/0.log" Mar 07 07:44:44 crc kubenswrapper[4738]: I0307 07:44:44.385349 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:44:44 crc kubenswrapper[4738]: E0307 07:44:44.385557 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:44:53 crc kubenswrapper[4738]: I0307 07:44:53.293984 4738 scope.go:117] "RemoveContainer" containerID="3547d7d0d99a9a179f33599dfbd628b73764e11f7e930b2b19628c889279422b" Mar 07 07:44:53 crc kubenswrapper[4738]: I0307 07:44:53.324681 4738 scope.go:117] "RemoveContainer" containerID="aa803c8fd3aac46126b5fda6b8a366f844a7664a9f0802a89e2caf80abbf2fae" Mar 07 07:44:53 crc kubenswrapper[4738]: I0307 07:44:53.457460 4738 scope.go:117] "RemoveContainer" containerID="49c0a505c0be687a79cbbfa717da8f6bb0a6ddbc00a465d224a2de7d385f1cfe" Mar 07 07:44:53 crc kubenswrapper[4738]: I0307 07:44:53.479278 4738 scope.go:117] "RemoveContainer" containerID="a5b46dbd76fc1e2ea069230f1601d39d5ac3ba38684f03a7107a5373d431d084" Mar 07 07:44:53 crc kubenswrapper[4738]: I0307 07:44:53.520880 4738 scope.go:117] "RemoveContainer" containerID="6eab9ec531abbd2644ab680063aed1693ac3bb8f83c9d53d28eb1525b47154c7" Mar 07 07:44:53 crc kubenswrapper[4738]: I0307 07:44:53.546947 4738 scope.go:117] "RemoveContainer" containerID="bbad7c893fe39e697cfa77a86f58e03b48f14de718b06021a61cc13359cee452" Mar 07 07:44:53 crc kubenswrapper[4738]: I0307 07:44:53.584744 4738 scope.go:117] "RemoveContainer" 
containerID="c1c073b1c2df0ddd6393eb070b8e8267f2f1ad27bdae61d288dbd1ea0e6211d0" Mar 07 07:44:53 crc kubenswrapper[4738]: I0307 07:44:53.616478 4738 scope.go:117] "RemoveContainer" containerID="39b267373222952ff87d6b5f4f9b66f561747b5f437b04f412706bcb4a59cf4d" Mar 07 07:44:56 crc kubenswrapper[4738]: I0307 07:44:56.365782 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qchnm_a530d643-c7f4-4f6c-8ff7-b256b67d6764/extract-utilities/0.log" Mar 07 07:44:56 crc kubenswrapper[4738]: I0307 07:44:56.578310 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qchnm_a530d643-c7f4-4f6c-8ff7-b256b67d6764/extract-utilities/0.log" Mar 07 07:44:56 crc kubenswrapper[4738]: I0307 07:44:56.593373 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qchnm_a530d643-c7f4-4f6c-8ff7-b256b67d6764/extract-content/0.log" Mar 07 07:44:56 crc kubenswrapper[4738]: I0307 07:44:56.605764 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qchnm_a530d643-c7f4-4f6c-8ff7-b256b67d6764/extract-content/0.log" Mar 07 07:44:56 crc kubenswrapper[4738]: I0307 07:44:56.806485 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qchnm_a530d643-c7f4-4f6c-8ff7-b256b67d6764/extract-utilities/0.log" Mar 07 07:44:56 crc kubenswrapper[4738]: I0307 07:44:56.819333 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qchnm_a530d643-c7f4-4f6c-8ff7-b256b67d6764/extract-content/0.log" Mar 07 07:44:56 crc kubenswrapper[4738]: I0307 07:44:56.981743 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-629nq_cd6468af-0f60-43a2-823a-abdfd6f3fbb4/extract-utilities/0.log" Mar 07 07:44:57 crc kubenswrapper[4738]: I0307 07:44:57.207663 4738 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-629nq_cd6468af-0f60-43a2-823a-abdfd6f3fbb4/extract-utilities/0.log" Mar 07 07:44:57 crc kubenswrapper[4738]: I0307 07:44:57.224505 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-629nq_cd6468af-0f60-43a2-823a-abdfd6f3fbb4/extract-content/0.log" Mar 07 07:44:57 crc kubenswrapper[4738]: I0307 07:44:57.239363 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-629nq_cd6468af-0f60-43a2-823a-abdfd6f3fbb4/extract-content/0.log" Mar 07 07:44:57 crc kubenswrapper[4738]: I0307 07:44:57.241086 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-qchnm_a530d643-c7f4-4f6c-8ff7-b256b67d6764/registry-server/0.log" Mar 07 07:44:57 crc kubenswrapper[4738]: I0307 07:44:57.385503 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:44:57 crc kubenswrapper[4738]: E0307 07:44:57.385846 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:44:57 crc kubenswrapper[4738]: I0307 07:44:57.405531 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-629nq_cd6468af-0f60-43a2-823a-abdfd6f3fbb4/extract-content/0.log" Mar 07 07:44:57 crc kubenswrapper[4738]: I0307 07:44:57.433924 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-629nq_cd6468af-0f60-43a2-823a-abdfd6f3fbb4/extract-utilities/0.log" 
Mar 07 07:44:57 crc kubenswrapper[4738]: I0307 07:44:57.633311 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-629nq_cd6468af-0f60-43a2-823a-abdfd6f3fbb4/registry-server/0.log" Mar 07 07:44:57 crc kubenswrapper[4738]: I0307 07:44:57.651302 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg_b3d9bf10-cf58-4926-9c16-fe8e0f322287/util/0.log" Mar 07 07:44:57 crc kubenswrapper[4738]: I0307 07:44:57.798244 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg_b3d9bf10-cf58-4926-9c16-fe8e0f322287/pull/0.log" Mar 07 07:44:57 crc kubenswrapper[4738]: I0307 07:44:57.814899 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg_b3d9bf10-cf58-4926-9c16-fe8e0f322287/util/0.log" Mar 07 07:44:57 crc kubenswrapper[4738]: I0307 07:44:57.830197 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg_b3d9bf10-cf58-4926-9c16-fe8e0f322287/pull/0.log" Mar 07 07:44:57 crc kubenswrapper[4738]: I0307 07:44:57.967550 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg_b3d9bf10-cf58-4926-9c16-fe8e0f322287/util/0.log" Mar 07 07:44:57 crc kubenswrapper[4738]: I0307 07:44:57.979532 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg_b3d9bf10-cf58-4926-9c16-fe8e0f322287/pull/0.log" Mar 07 07:44:58 crc kubenswrapper[4738]: I0307 07:44:58.022378 4738 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4226qg_b3d9bf10-cf58-4926-9c16-fe8e0f322287/extract/0.log" Mar 07 07:44:58 crc kubenswrapper[4738]: I0307 07:44:58.118827 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-qzsqg_16598a41-f8be-4b82-84c1-b718c0b24b8e/marketplace-operator/0.log" Mar 07 07:44:58 crc kubenswrapper[4738]: I0307 07:44:58.174103 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6glbr_5c8341b7-746a-448e-92e5-b7011a75e332/extract-utilities/0.log" Mar 07 07:44:58 crc kubenswrapper[4738]: I0307 07:44:58.336296 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6glbr_5c8341b7-746a-448e-92e5-b7011a75e332/extract-content/0.log" Mar 07 07:44:58 crc kubenswrapper[4738]: I0307 07:44:58.350113 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6glbr_5c8341b7-746a-448e-92e5-b7011a75e332/extract-utilities/0.log" Mar 07 07:44:58 crc kubenswrapper[4738]: I0307 07:44:58.362941 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6glbr_5c8341b7-746a-448e-92e5-b7011a75e332/extract-content/0.log" Mar 07 07:44:58 crc kubenswrapper[4738]: I0307 07:44:58.522465 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6glbr_5c8341b7-746a-448e-92e5-b7011a75e332/extract-content/0.log" Mar 07 07:44:58 crc kubenswrapper[4738]: I0307 07:44:58.536881 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6glbr_5c8341b7-746a-448e-92e5-b7011a75e332/extract-utilities/0.log" Mar 07 07:44:58 crc kubenswrapper[4738]: I0307 07:44:58.666440 4738 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-6glbr_5c8341b7-746a-448e-92e5-b7011a75e332/registry-server/0.log" Mar 07 07:44:58 crc kubenswrapper[4738]: I0307 07:44:58.692678 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9t248_3aa7e15f-67b8-491e-b579-a52873583f7f/extract-utilities/0.log" Mar 07 07:44:58 crc kubenswrapper[4738]: I0307 07:44:58.876928 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9t248_3aa7e15f-67b8-491e-b579-a52873583f7f/extract-content/0.log" Mar 07 07:44:58 crc kubenswrapper[4738]: I0307 07:44:58.879730 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9t248_3aa7e15f-67b8-491e-b579-a52873583f7f/extract-utilities/0.log" Mar 07 07:44:58 crc kubenswrapper[4738]: I0307 07:44:58.896472 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9t248_3aa7e15f-67b8-491e-b579-a52873583f7f/extract-content/0.log" Mar 07 07:44:59 crc kubenswrapper[4738]: I0307 07:44:59.048911 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9t248_3aa7e15f-67b8-491e-b579-a52873583f7f/extract-content/0.log" Mar 07 07:44:59 crc kubenswrapper[4738]: I0307 07:44:59.053754 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9t248_3aa7e15f-67b8-491e-b579-a52873583f7f/extract-utilities/0.log" Mar 07 07:44:59 crc kubenswrapper[4738]: I0307 07:44:59.540209 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9t248_3aa7e15f-67b8-491e-b579-a52873583f7f/registry-server/0.log" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.148553 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l"] Mar 07 07:45:00 crc kubenswrapper[4738]: E0307 
07:45:00.149144 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa" containerName="oc" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.149179 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa" containerName="oc" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.149350 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa" containerName="oc" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.149853 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.151524 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.152875 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.158902 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l"] Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.235320 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52e77c9d-c5e8-4032-9dc3-d8031afb470d-secret-volume\") pod \"collect-profiles-29547825-b645l\" (UID: \"52e77c9d-c5e8-4032-9dc3-d8031afb470d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.235396 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc2kx\" (UniqueName: 
\"kubernetes.io/projected/52e77c9d-c5e8-4032-9dc3-d8031afb470d-kube-api-access-jc2kx\") pod \"collect-profiles-29547825-b645l\" (UID: \"52e77c9d-c5e8-4032-9dc3-d8031afb470d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.235435 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52e77c9d-c5e8-4032-9dc3-d8031afb470d-config-volume\") pod \"collect-profiles-29547825-b645l\" (UID: \"52e77c9d-c5e8-4032-9dc3-d8031afb470d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.336461 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52e77c9d-c5e8-4032-9dc3-d8031afb470d-secret-volume\") pod \"collect-profiles-29547825-b645l\" (UID: \"52e77c9d-c5e8-4032-9dc3-d8031afb470d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.336842 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc2kx\" (UniqueName: \"kubernetes.io/projected/52e77c9d-c5e8-4032-9dc3-d8031afb470d-kube-api-access-jc2kx\") pod \"collect-profiles-29547825-b645l\" (UID: \"52e77c9d-c5e8-4032-9dc3-d8031afb470d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.336962 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52e77c9d-c5e8-4032-9dc3-d8031afb470d-config-volume\") pod \"collect-profiles-29547825-b645l\" (UID: \"52e77c9d-c5e8-4032-9dc3-d8031afb470d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" Mar 07 07:45:00 crc 
kubenswrapper[4738]: I0307 07:45:00.338072 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52e77c9d-c5e8-4032-9dc3-d8031afb470d-config-volume\") pod \"collect-profiles-29547825-b645l\" (UID: \"52e77c9d-c5e8-4032-9dc3-d8031afb470d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.347927 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52e77c9d-c5e8-4032-9dc3-d8031afb470d-secret-volume\") pod \"collect-profiles-29547825-b645l\" (UID: \"52e77c9d-c5e8-4032-9dc3-d8031afb470d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.356953 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc2kx\" (UniqueName: \"kubernetes.io/projected/52e77c9d-c5e8-4032-9dc3-d8031afb470d-kube-api-access-jc2kx\") pod \"collect-profiles-29547825-b645l\" (UID: \"52e77c9d-c5e8-4032-9dc3-d8031afb470d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.481071 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" Mar 07 07:45:00 crc kubenswrapper[4738]: I0307 07:45:00.953049 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l"] Mar 07 07:45:00 crc kubenswrapper[4738]: W0307 07:45:00.964308 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52e77c9d_c5e8_4032_9dc3_d8031afb470d.slice/crio-0429fd56d20290d4ce28e5171067dcc05c85e7bf036890e5bbade0c1600bcfac WatchSource:0}: Error finding container 0429fd56d20290d4ce28e5171067dcc05c85e7bf036890e5bbade0c1600bcfac: Status 404 returned error can't find the container with id 0429fd56d20290d4ce28e5171067dcc05c85e7bf036890e5bbade0c1600bcfac Mar 07 07:45:01 crc kubenswrapper[4738]: I0307 07:45:01.020725 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" event={"ID":"52e77c9d-c5e8-4032-9dc3-d8031afb470d","Type":"ContainerStarted","Data":"0429fd56d20290d4ce28e5171067dcc05c85e7bf036890e5bbade0c1600bcfac"} Mar 07 07:45:02 crc kubenswrapper[4738]: I0307 07:45:02.035295 4738 generic.go:334] "Generic (PLEG): container finished" podID="52e77c9d-c5e8-4032-9dc3-d8031afb470d" containerID="180c0ea291d32c7a3151d7abacfd085188393169f713e9ec9548e23b6f0aa6dc" exitCode=0 Mar 07 07:45:02 crc kubenswrapper[4738]: I0307 07:45:02.035920 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" event={"ID":"52e77c9d-c5e8-4032-9dc3-d8031afb470d","Type":"ContainerDied","Data":"180c0ea291d32c7a3151d7abacfd085188393169f713e9ec9548e23b6f0aa6dc"} Mar 07 07:45:03 crc kubenswrapper[4738]: I0307 07:45:03.309197 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" Mar 07 07:45:03 crc kubenswrapper[4738]: I0307 07:45:03.482168 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc2kx\" (UniqueName: \"kubernetes.io/projected/52e77c9d-c5e8-4032-9dc3-d8031afb470d-kube-api-access-jc2kx\") pod \"52e77c9d-c5e8-4032-9dc3-d8031afb470d\" (UID: \"52e77c9d-c5e8-4032-9dc3-d8031afb470d\") " Mar 07 07:45:03 crc kubenswrapper[4738]: I0307 07:45:03.482266 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52e77c9d-c5e8-4032-9dc3-d8031afb470d-secret-volume\") pod \"52e77c9d-c5e8-4032-9dc3-d8031afb470d\" (UID: \"52e77c9d-c5e8-4032-9dc3-d8031afb470d\") " Mar 07 07:45:03 crc kubenswrapper[4738]: I0307 07:45:03.482330 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52e77c9d-c5e8-4032-9dc3-d8031afb470d-config-volume\") pod \"52e77c9d-c5e8-4032-9dc3-d8031afb470d\" (UID: \"52e77c9d-c5e8-4032-9dc3-d8031afb470d\") " Mar 07 07:45:03 crc kubenswrapper[4738]: I0307 07:45:03.483135 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52e77c9d-c5e8-4032-9dc3-d8031afb470d-config-volume" (OuterVolumeSpecName: "config-volume") pod "52e77c9d-c5e8-4032-9dc3-d8031afb470d" (UID: "52e77c9d-c5e8-4032-9dc3-d8031afb470d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 07:45:03 crc kubenswrapper[4738]: I0307 07:45:03.487482 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52e77c9d-c5e8-4032-9dc3-d8031afb470d-kube-api-access-jc2kx" (OuterVolumeSpecName: "kube-api-access-jc2kx") pod "52e77c9d-c5e8-4032-9dc3-d8031afb470d" (UID: "52e77c9d-c5e8-4032-9dc3-d8031afb470d"). 
InnerVolumeSpecName "kube-api-access-jc2kx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:45:03 crc kubenswrapper[4738]: I0307 07:45:03.487662 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52e77c9d-c5e8-4032-9dc3-d8031afb470d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "52e77c9d-c5e8-4032-9dc3-d8031afb470d" (UID: "52e77c9d-c5e8-4032-9dc3-d8031afb470d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 07:45:03 crc kubenswrapper[4738]: I0307 07:45:03.583847 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc2kx\" (UniqueName: \"kubernetes.io/projected/52e77c9d-c5e8-4032-9dc3-d8031afb470d-kube-api-access-jc2kx\") on node \"crc\" DevicePath \"\"" Mar 07 07:45:03 crc kubenswrapper[4738]: I0307 07:45:03.583887 4738 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/52e77c9d-c5e8-4032-9dc3-d8031afb470d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 07 07:45:03 crc kubenswrapper[4738]: I0307 07:45:03.583901 4738 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/52e77c9d-c5e8-4032-9dc3-d8031afb470d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 07 07:45:04 crc kubenswrapper[4738]: I0307 07:45:04.055730 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" event={"ID":"52e77c9d-c5e8-4032-9dc3-d8031afb470d","Type":"ContainerDied","Data":"0429fd56d20290d4ce28e5171067dcc05c85e7bf036890e5bbade0c1600bcfac"} Mar 07 07:45:04 crc kubenswrapper[4738]: I0307 07:45:04.056090 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0429fd56d20290d4ce28e5171067dcc05c85e7bf036890e5bbade0c1600bcfac" Mar 07 07:45:04 crc kubenswrapper[4738]: I0307 07:45:04.055781 4738 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29547825-b645l" Mar 07 07:45:04 crc kubenswrapper[4738]: I0307 07:45:04.377899 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb"] Mar 07 07:45:04 crc kubenswrapper[4738]: I0307 07:45:04.395016 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29547780-62qtb"] Mar 07 07:45:06 crc kubenswrapper[4738]: I0307 07:45:06.393179 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1528346-60ec-4879-81b0-72027f1d1477" path="/var/lib/kubelet/pods/e1528346-60ec-4879-81b0-72027f1d1477/volumes" Mar 07 07:45:12 crc kubenswrapper[4738]: I0307 07:45:12.390822 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:45:12 crc kubenswrapper[4738]: E0307 07:45:12.391759 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:45:25 crc kubenswrapper[4738]: I0307 07:45:25.386479 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:45:25 crc kubenswrapper[4738]: E0307 07:45:25.387594 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:45:36 crc kubenswrapper[4738]: I0307 07:45:36.390804 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:45:36 crc kubenswrapper[4738]: E0307 07:45:36.391687 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:45:49 crc kubenswrapper[4738]: I0307 07:45:49.386291 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:45:49 crc kubenswrapper[4738]: E0307 07:45:49.387309 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:45:53 crc kubenswrapper[4738]: I0307 07:45:53.776726 4738 scope.go:117] "RemoveContainer" containerID="fa20a124747707551ac51b7142e65d808067cf5bbd2163af96dc582826363c37" Mar 07 07:45:53 crc kubenswrapper[4738]: I0307 07:45:53.828807 4738 scope.go:117] "RemoveContainer" containerID="9f80f2af8c4e7c35651be8becd660a0956c226418eb9ea9fad6d192a44467c1d" Mar 07 07:45:53 crc kubenswrapper[4738]: I0307 07:45:53.863489 4738 scope.go:117] "RemoveContainer" containerID="765e90488d1127368807c1f5d5a90aa5f37788f6f8bcd4327d556eb6b87acdc8" Mar 
07 07:45:53 crc kubenswrapper[4738]: I0307 07:45:53.893546 4738 scope.go:117] "RemoveContainer" containerID="5dd28070e510fc57d943386571d5f4061c9362505533716af779702c2b32a921" Mar 07 07:45:53 crc kubenswrapper[4738]: I0307 07:45:53.913364 4738 scope.go:117] "RemoveContainer" containerID="9521ab12d63a43a24ce0f7cb4ab290e6a2e30dd38787c73e47bfbadfd4482144" Mar 07 07:45:53 crc kubenswrapper[4738]: I0307 07:45:53.942224 4738 scope.go:117] "RemoveContainer" containerID="7521e3e6d4c22efe37d1fb931b5041f97959d49153a5c334abbc750e3d5e5d23" Mar 07 07:45:53 crc kubenswrapper[4738]: I0307 07:45:53.970424 4738 scope.go:117] "RemoveContainer" containerID="0dec7506154bca4e104a462cf78c6947214246c96a1259917c9874822328c63f" Mar 07 07:45:53 crc kubenswrapper[4738]: I0307 07:45:53.990881 4738 scope.go:117] "RemoveContainer" containerID="a273d92a6ebc4a0727ca734e4c4843ac899106309db60192f8077a59a167418c" Mar 07 07:45:54 crc kubenswrapper[4738]: I0307 07:45:54.016009 4738 scope.go:117] "RemoveContainer" containerID="fac926deeddc1f21c400244b6cd3a55368d5330a4b28a7895fda3987b0ce8abb" Mar 07 07:46:00 crc kubenswrapper[4738]: I0307 07:46:00.165607 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547826-qd2hw"] Mar 07 07:46:00 crc kubenswrapper[4738]: E0307 07:46:00.166707 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52e77c9d-c5e8-4032-9dc3-d8031afb470d" containerName="collect-profiles" Mar 07 07:46:00 crc kubenswrapper[4738]: I0307 07:46:00.166723 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="52e77c9d-c5e8-4032-9dc3-d8031afb470d" containerName="collect-profiles" Mar 07 07:46:00 crc kubenswrapper[4738]: I0307 07:46:00.166937 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="52e77c9d-c5e8-4032-9dc3-d8031afb470d" containerName="collect-profiles" Mar 07 07:46:00 crc kubenswrapper[4738]: I0307 07:46:00.167519 4738 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29547826-qd2hw" Mar 07 07:46:00 crc kubenswrapper[4738]: I0307 07:46:00.169520 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 07 07:46:00 crc kubenswrapper[4738]: I0307 07:46:00.171257 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5" Mar 07 07:46:00 crc kubenswrapper[4738]: I0307 07:46:00.174728 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 07 07:46:00 crc kubenswrapper[4738]: I0307 07:46:00.185681 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547826-qd2hw"] Mar 07 07:46:00 crc kubenswrapper[4738]: I0307 07:46:00.299647 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkwj8\" (UniqueName: \"kubernetes.io/projected/1eb3fea1-4c42-4b0e-920b-825787d920af-kube-api-access-lkwj8\") pod \"auto-csr-approver-29547826-qd2hw\" (UID: \"1eb3fea1-4c42-4b0e-920b-825787d920af\") " pod="openshift-infra/auto-csr-approver-29547826-qd2hw" Mar 07 07:46:00 crc kubenswrapper[4738]: I0307 07:46:00.402081 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkwj8\" (UniqueName: \"kubernetes.io/projected/1eb3fea1-4c42-4b0e-920b-825787d920af-kube-api-access-lkwj8\") pod \"auto-csr-approver-29547826-qd2hw\" (UID: \"1eb3fea1-4c42-4b0e-920b-825787d920af\") " pod="openshift-infra/auto-csr-approver-29547826-qd2hw" Mar 07 07:46:00 crc kubenswrapper[4738]: I0307 07:46:00.436637 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkwj8\" (UniqueName: \"kubernetes.io/projected/1eb3fea1-4c42-4b0e-920b-825787d920af-kube-api-access-lkwj8\") pod \"auto-csr-approver-29547826-qd2hw\" (UID: \"1eb3fea1-4c42-4b0e-920b-825787d920af\") " 
pod="openshift-infra/auto-csr-approver-29547826-qd2hw" Mar 07 07:46:00 crc kubenswrapper[4738]: I0307 07:46:00.486749 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547826-qd2hw" Mar 07 07:46:00 crc kubenswrapper[4738]: I0307 07:46:00.973108 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547826-qd2hw"] Mar 07 07:46:00 crc kubenswrapper[4738]: I0307 07:46:00.973589 4738 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 07:46:01 crc kubenswrapper[4738]: I0307 07:46:01.386111 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:46:01 crc kubenswrapper[4738]: E0307 07:46:01.386421 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:46:01 crc kubenswrapper[4738]: I0307 07:46:01.694182 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547826-qd2hw" event={"ID":"1eb3fea1-4c42-4b0e-920b-825787d920af","Type":"ContainerStarted","Data":"a066f0f191d3e7afc9beab1e431507616e08b44784c4870d58c5c1fbd94477c5"} Mar 07 07:46:02 crc kubenswrapper[4738]: I0307 07:46:02.708382 4738 generic.go:334] "Generic (PLEG): container finished" podID="1eb3fea1-4c42-4b0e-920b-825787d920af" containerID="9835a1f4eab001dfc7622c4667e21285287dd298bdfe82f5a5becaecc525a12c" exitCode=0 Mar 07 07:46:02 crc kubenswrapper[4738]: I0307 07:46:02.708855 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29547826-qd2hw" event={"ID":"1eb3fea1-4c42-4b0e-920b-825787d920af","Type":"ContainerDied","Data":"9835a1f4eab001dfc7622c4667e21285287dd298bdfe82f5a5becaecc525a12c"} Mar 07 07:46:04 crc kubenswrapper[4738]: I0307 07:46:04.030849 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547826-qd2hw" Mar 07 07:46:04 crc kubenswrapper[4738]: I0307 07:46:04.086079 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkwj8\" (UniqueName: \"kubernetes.io/projected/1eb3fea1-4c42-4b0e-920b-825787d920af-kube-api-access-lkwj8\") pod \"1eb3fea1-4c42-4b0e-920b-825787d920af\" (UID: \"1eb3fea1-4c42-4b0e-920b-825787d920af\") " Mar 07 07:46:04 crc kubenswrapper[4738]: I0307 07:46:04.095695 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eb3fea1-4c42-4b0e-920b-825787d920af-kube-api-access-lkwj8" (OuterVolumeSpecName: "kube-api-access-lkwj8") pod "1eb3fea1-4c42-4b0e-920b-825787d920af" (UID: "1eb3fea1-4c42-4b0e-920b-825787d920af"). InnerVolumeSpecName "kube-api-access-lkwj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:46:04 crc kubenswrapper[4738]: I0307 07:46:04.187908 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkwj8\" (UniqueName: \"kubernetes.io/projected/1eb3fea1-4c42-4b0e-920b-825787d920af-kube-api-access-lkwj8\") on node \"crc\" DevicePath \"\"" Mar 07 07:46:04 crc kubenswrapper[4738]: I0307 07:46:04.754865 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547826-qd2hw" event={"ID":"1eb3fea1-4c42-4b0e-920b-825787d920af","Type":"ContainerDied","Data":"a066f0f191d3e7afc9beab1e431507616e08b44784c4870d58c5c1fbd94477c5"} Mar 07 07:46:04 crc kubenswrapper[4738]: I0307 07:46:04.754922 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a066f0f191d3e7afc9beab1e431507616e08b44784c4870d58c5c1fbd94477c5" Mar 07 07:46:04 crc kubenswrapper[4738]: I0307 07:46:04.754956 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547826-qd2hw" Mar 07 07:46:05 crc kubenswrapper[4738]: I0307 07:46:05.113943 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547820-788v7"] Mar 07 07:46:05 crc kubenswrapper[4738]: I0307 07:46:05.124928 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547820-788v7"] Mar 07 07:46:06 crc kubenswrapper[4738]: I0307 07:46:06.396345 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f325caa-c71f-44ae-8312-2241b9b8bc67" path="/var/lib/kubelet/pods/4f325caa-c71f-44ae-8312-2241b9b8bc67/volumes" Mar 07 07:46:12 crc kubenswrapper[4738]: I0307 07:46:12.829616 4738 generic.go:334] "Generic (PLEG): container finished" podID="2e8ecd28-5ba0-430f-9f73-b1450163a008" containerID="754e750ffc2dd6feca5a17f1f0c64acd3a1d2acd2c23b31777d017d2b908b2a1" exitCode=0 Mar 07 07:46:12 crc kubenswrapper[4738]: I0307 07:46:12.829693 4738 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rfp5n/must-gather-7zxj8" event={"ID":"2e8ecd28-5ba0-430f-9f73-b1450163a008","Type":"ContainerDied","Data":"754e750ffc2dd6feca5a17f1f0c64acd3a1d2acd2c23b31777d017d2b908b2a1"} Mar 07 07:46:12 crc kubenswrapper[4738]: I0307 07:46:12.830979 4738 scope.go:117] "RemoveContainer" containerID="754e750ffc2dd6feca5a17f1f0c64acd3a1d2acd2c23b31777d017d2b908b2a1" Mar 07 07:46:13 crc kubenswrapper[4738]: I0307 07:46:13.364570 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rfp5n_must-gather-7zxj8_2e8ecd28-5ba0-430f-9f73-b1450163a008/gather/0.log" Mar 07 07:46:14 crc kubenswrapper[4738]: I0307 07:46:14.387271 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:46:14 crc kubenswrapper[4738]: E0307 07:46:14.387591 4738 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-t7vcc_openshift-machine-config-operator(0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7)\"" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" Mar 07 07:46:20 crc kubenswrapper[4738]: I0307 07:46:20.638516 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-rfp5n/must-gather-7zxj8"] Mar 07 07:46:20 crc kubenswrapper[4738]: I0307 07:46:20.639442 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-rfp5n/must-gather-7zxj8" podUID="2e8ecd28-5ba0-430f-9f73-b1450163a008" containerName="copy" containerID="cri-o://8bd2f809e8e03406435aef75bab7ef1d2f31f2bf40fa619b90690305722342b6" gracePeriod=2 Mar 07 07:46:20 crc kubenswrapper[4738]: I0307 07:46:20.647387 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-must-gather-rfp5n/must-gather-7zxj8"] Mar 07 07:46:20 crc kubenswrapper[4738]: I0307 07:46:20.914804 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rfp5n_must-gather-7zxj8_2e8ecd28-5ba0-430f-9f73-b1450163a008/copy/0.log" Mar 07 07:46:20 crc kubenswrapper[4738]: I0307 07:46:20.915482 4738 generic.go:334] "Generic (PLEG): container finished" podID="2e8ecd28-5ba0-430f-9f73-b1450163a008" containerID="8bd2f809e8e03406435aef75bab7ef1d2f31f2bf40fa619b90690305722342b6" exitCode=143 Mar 07 07:46:21 crc kubenswrapper[4738]: I0307 07:46:21.021298 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rfp5n_must-gather-7zxj8_2e8ecd28-5ba0-430f-9f73-b1450163a008/copy/0.log" Mar 07 07:46:21 crc kubenswrapper[4738]: I0307 07:46:21.021830 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rfp5n/must-gather-7zxj8" Mar 07 07:46:21 crc kubenswrapper[4738]: I0307 07:46:21.058013 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e8ecd28-5ba0-430f-9f73-b1450163a008-must-gather-output\") pod \"2e8ecd28-5ba0-430f-9f73-b1450163a008\" (UID: \"2e8ecd28-5ba0-430f-9f73-b1450163a008\") " Mar 07 07:46:21 crc kubenswrapper[4738]: I0307 07:46:21.058260 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rx67h\" (UniqueName: \"kubernetes.io/projected/2e8ecd28-5ba0-430f-9f73-b1450163a008-kube-api-access-rx67h\") pod \"2e8ecd28-5ba0-430f-9f73-b1450163a008\" (UID: \"2e8ecd28-5ba0-430f-9f73-b1450163a008\") " Mar 07 07:46:21 crc kubenswrapper[4738]: I0307 07:46:21.066042 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e8ecd28-5ba0-430f-9f73-b1450163a008-kube-api-access-rx67h" (OuterVolumeSpecName: "kube-api-access-rx67h") pod "2e8ecd28-5ba0-430f-9f73-b1450163a008" 
(UID: "2e8ecd28-5ba0-430f-9f73-b1450163a008"). InnerVolumeSpecName "kube-api-access-rx67h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:46:21 crc kubenswrapper[4738]: I0307 07:46:21.136841 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e8ecd28-5ba0-430f-9f73-b1450163a008-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "2e8ecd28-5ba0-430f-9f73-b1450163a008" (UID: "2e8ecd28-5ba0-430f-9f73-b1450163a008"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:46:21 crc kubenswrapper[4738]: I0307 07:46:21.159890 4738 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/2e8ecd28-5ba0-430f-9f73-b1450163a008-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 07 07:46:21 crc kubenswrapper[4738]: I0307 07:46:21.160105 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rx67h\" (UniqueName: \"kubernetes.io/projected/2e8ecd28-5ba0-430f-9f73-b1450163a008-kube-api-access-rx67h\") on node \"crc\" DevicePath \"\"" Mar 07 07:46:21 crc kubenswrapper[4738]: I0307 07:46:21.925021 4738 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-rfp5n_must-gather-7zxj8_2e8ecd28-5ba0-430f-9f73-b1450163a008/copy/0.log" Mar 07 07:46:21 crc kubenswrapper[4738]: I0307 07:46:21.925477 4738 scope.go:117] "RemoveContainer" containerID="8bd2f809e8e03406435aef75bab7ef1d2f31f2bf40fa619b90690305722342b6" Mar 07 07:46:21 crc kubenswrapper[4738]: I0307 07:46:21.925531 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rfp5n/must-gather-7zxj8" Mar 07 07:46:21 crc kubenswrapper[4738]: I0307 07:46:21.947263 4738 scope.go:117] "RemoveContainer" containerID="754e750ffc2dd6feca5a17f1f0c64acd3a1d2acd2c23b31777d017d2b908b2a1" Mar 07 07:46:22 crc kubenswrapper[4738]: I0307 07:46:22.395592 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e8ecd28-5ba0-430f-9f73-b1450163a008" path="/var/lib/kubelet/pods/2e8ecd28-5ba0-430f-9f73-b1450163a008/volumes" Mar 07 07:46:27 crc kubenswrapper[4738]: I0307 07:46:27.386718 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee" Mar 07 07:46:27 crc kubenswrapper[4738]: I0307 07:46:27.997588 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerStarted","Data":"8b6cb246b831f526704e981293eee2a64b63091b8dbaca6aa0bf12d927dad461"} Mar 07 07:46:54 crc kubenswrapper[4738]: I0307 07:46:54.181042 4738 scope.go:117] "RemoveContainer" containerID="b971718c5b4a2f871edb9241bc8aa233afa17f97673e268677abe2711aa49fc2" Mar 07 07:46:54 crc kubenswrapper[4738]: I0307 07:46:54.227185 4738 scope.go:117] "RemoveContainer" containerID="a8ded881e7942ac605e669272761fb6817e39ffc813409b452b44646ecff79b1" Mar 07 07:46:54 crc kubenswrapper[4738]: I0307 07:46:54.256581 4738 scope.go:117] "RemoveContainer" containerID="9ac421ac9a273f592cdfd7681b68d521cebc53b64a606b7905a3cd78d7639e2c" Mar 07 07:46:54 crc kubenswrapper[4738]: I0307 07:46:54.300109 4738 scope.go:117] "RemoveContainer" containerID="9aa776cfcdbab9636869a5ea15ac1b0a266f0a350ded10223bbf16ea8f246c1c" Mar 07 07:46:54 crc kubenswrapper[4738]: I0307 07:46:54.332693 4738 scope.go:117] "RemoveContainer" containerID="4e01c39fab11c7f0a3ec8732745f40155ef723f801743e383953a57cad658908" Mar 07 07:46:54 crc kubenswrapper[4738]: I0307 07:46:54.367399 4738 scope.go:117] 
"RemoveContainer" containerID="b13bafba945e0d6abf55d42d8391c8fcbf9cd0a204c967dc3f1d4313a0f56b1f" Mar 07 07:46:54 crc kubenswrapper[4738]: I0307 07:46:54.394133 4738 scope.go:117] "RemoveContainer" containerID="970881ca3efb891b5ebf507eeacd6bb1b036e04b5f582affe53a0c08bd013c5c" Mar 07 07:46:54 crc kubenswrapper[4738]: I0307 07:46:54.419943 4738 scope.go:117] "RemoveContainer" containerID="b10c17d9490e9914efc5e0ac5da845a28f8b08a16a98e4c4781722cb12288844" Mar 07 07:46:54 crc kubenswrapper[4738]: I0307 07:46:54.446088 4738 scope.go:117] "RemoveContainer" containerID="b1dd50e628df8e935a3aea935e54a1b59e7007b3551ee2d3f78864bcde759079" Mar 07 07:46:54 crc kubenswrapper[4738]: I0307 07:46:54.463922 4738 scope.go:117] "RemoveContainer" containerID="1d6f8543e7c8a885d5b9742fdd7c136bdab0a4aff38306f24a5ad8fdbb3f7163" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.262304 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6cz8z"] Mar 07 07:47:12 crc kubenswrapper[4738]: E0307 07:47:12.263359 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eb3fea1-4c42-4b0e-920b-825787d920af" containerName="oc" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.263377 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eb3fea1-4c42-4b0e-920b-825787d920af" containerName="oc" Mar 07 07:47:12 crc kubenswrapper[4738]: E0307 07:47:12.263406 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8ecd28-5ba0-430f-9f73-b1450163a008" containerName="copy" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.263416 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e8ecd28-5ba0-430f-9f73-b1450163a008" containerName="copy" Mar 07 07:47:12 crc kubenswrapper[4738]: E0307 07:47:12.263447 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2e8ecd28-5ba0-430f-9f73-b1450163a008" containerName="gather" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.263458 4738 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2e8ecd28-5ba0-430f-9f73-b1450163a008" containerName="gather" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.263680 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e8ecd28-5ba0-430f-9f73-b1450163a008" containerName="copy" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.263707 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="2e8ecd28-5ba0-430f-9f73-b1450163a008" containerName="gather" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.263719 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eb3fea1-4c42-4b0e-920b-825787d920af" containerName="oc" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.265823 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.277119 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6cz8z"] Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.288998 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhb9s\" (UniqueName: \"kubernetes.io/projected/966fbd45-127e-4cc3-a7b2-f659960214eb-kube-api-access-xhb9s\") pod \"redhat-marketplace-6cz8z\" (UID: \"966fbd45-127e-4cc3-a7b2-f659960214eb\") " pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.289074 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966fbd45-127e-4cc3-a7b2-f659960214eb-catalog-content\") pod \"redhat-marketplace-6cz8z\" (UID: \"966fbd45-127e-4cc3-a7b2-f659960214eb\") " pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.289141 4738 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966fbd45-127e-4cc3-a7b2-f659960214eb-utilities\") pod \"redhat-marketplace-6cz8z\" (UID: \"966fbd45-127e-4cc3-a7b2-f659960214eb\") " pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.390032 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhb9s\" (UniqueName: \"kubernetes.io/projected/966fbd45-127e-4cc3-a7b2-f659960214eb-kube-api-access-xhb9s\") pod \"redhat-marketplace-6cz8z\" (UID: \"966fbd45-127e-4cc3-a7b2-f659960214eb\") " pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.390083 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966fbd45-127e-4cc3-a7b2-f659960214eb-catalog-content\") pod \"redhat-marketplace-6cz8z\" (UID: \"966fbd45-127e-4cc3-a7b2-f659960214eb\") " pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.390122 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966fbd45-127e-4cc3-a7b2-f659960214eb-utilities\") pod \"redhat-marketplace-6cz8z\" (UID: \"966fbd45-127e-4cc3-a7b2-f659960214eb\") " pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.390574 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966fbd45-127e-4cc3-a7b2-f659960214eb-utilities\") pod \"redhat-marketplace-6cz8z\" (UID: \"966fbd45-127e-4cc3-a7b2-f659960214eb\") " pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.390738 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966fbd45-127e-4cc3-a7b2-f659960214eb-catalog-content\") pod \"redhat-marketplace-6cz8z\" (UID: \"966fbd45-127e-4cc3-a7b2-f659960214eb\") " pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.442066 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhb9s\" (UniqueName: \"kubernetes.io/projected/966fbd45-127e-4cc3-a7b2-f659960214eb-kube-api-access-xhb9s\") pod \"redhat-marketplace-6cz8z\" (UID: \"966fbd45-127e-4cc3-a7b2-f659960214eb\") " pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:12 crc kubenswrapper[4738]: I0307 07:47:12.587946 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:13 crc kubenswrapper[4738]: I0307 07:47:13.014783 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6cz8z"] Mar 07 07:47:13 crc kubenswrapper[4738]: I0307 07:47:13.451715 4738 generic.go:334] "Generic (PLEG): container finished" podID="966fbd45-127e-4cc3-a7b2-f659960214eb" containerID="ca01ab5354d363d466c857e0f7546c1db6e9c8e5ef635d5bfb4058ca6f7e9dc4" exitCode=0 Mar 07 07:47:13 crc kubenswrapper[4738]: I0307 07:47:13.451761 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6cz8z" event={"ID":"966fbd45-127e-4cc3-a7b2-f659960214eb","Type":"ContainerDied","Data":"ca01ab5354d363d466c857e0f7546c1db6e9c8e5ef635d5bfb4058ca6f7e9dc4"} Mar 07 07:47:13 crc kubenswrapper[4738]: I0307 07:47:13.451790 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6cz8z" event={"ID":"966fbd45-127e-4cc3-a7b2-f659960214eb","Type":"ContainerStarted","Data":"802c72184925c6d89ddff7e024c8f21bd4a19321de0b335e8b89d9714218b79e"} Mar 07 07:47:14 crc kubenswrapper[4738]: I0307 07:47:14.479370 4738 
generic.go:334] "Generic (PLEG): container finished" podID="966fbd45-127e-4cc3-a7b2-f659960214eb" containerID="42edae66f2e3bad1e447ac6f8cae8d57c461efa9c7430dd2b8df5991c79db3be" exitCode=0 Mar 07 07:47:14 crc kubenswrapper[4738]: I0307 07:47:14.479519 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6cz8z" event={"ID":"966fbd45-127e-4cc3-a7b2-f659960214eb","Type":"ContainerDied","Data":"42edae66f2e3bad1e447ac6f8cae8d57c461efa9c7430dd2b8df5991c79db3be"} Mar 07 07:47:15 crc kubenswrapper[4738]: I0307 07:47:15.491682 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6cz8z" event={"ID":"966fbd45-127e-4cc3-a7b2-f659960214eb","Type":"ContainerStarted","Data":"2e15b5d60ebf7acdcccd9c94210c13f3a347279ea83c3a809602bcfb42b66317"} Mar 07 07:47:15 crc kubenswrapper[4738]: I0307 07:47:15.516185 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6cz8z" podStartSLOduration=1.9695037119999999 podStartE2EDuration="3.516146913s" podCreationTimestamp="2026-03-07 07:47:12 +0000 UTC" firstStartedPulling="2026-03-07 07:47:13.455287674 +0000 UTC m=+2851.920275015" lastFinishedPulling="2026-03-07 07:47:15.001930885 +0000 UTC m=+2853.466918216" observedRunningTime="2026-03-07 07:47:15.514142249 +0000 UTC m=+2853.979129590" watchObservedRunningTime="2026-03-07 07:47:15.516146913 +0000 UTC m=+2853.981134244" Mar 07 07:47:22 crc kubenswrapper[4738]: I0307 07:47:22.589070 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:22 crc kubenswrapper[4738]: I0307 07:47:22.589927 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:22 crc kubenswrapper[4738]: I0307 07:47:22.664932 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:23 crc kubenswrapper[4738]: I0307 07:47:23.630749 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:23 crc kubenswrapper[4738]: I0307 07:47:23.699762 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6cz8z"] Mar 07 07:47:25 crc kubenswrapper[4738]: I0307 07:47:25.588521 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6cz8z" podUID="966fbd45-127e-4cc3-a7b2-f659960214eb" containerName="registry-server" containerID="cri-o://2e15b5d60ebf7acdcccd9c94210c13f3a347279ea83c3a809602bcfb42b66317" gracePeriod=2 Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.063367 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.102003 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966fbd45-127e-4cc3-a7b2-f659960214eb-utilities\") pod \"966fbd45-127e-4cc3-a7b2-f659960214eb\" (UID: \"966fbd45-127e-4cc3-a7b2-f659960214eb\") " Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.102073 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966fbd45-127e-4cc3-a7b2-f659960214eb-catalog-content\") pod \"966fbd45-127e-4cc3-a7b2-f659960214eb\" (UID: \"966fbd45-127e-4cc3-a7b2-f659960214eb\") " Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.102116 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhb9s\" (UniqueName: \"kubernetes.io/projected/966fbd45-127e-4cc3-a7b2-f659960214eb-kube-api-access-xhb9s\") pod 
\"966fbd45-127e-4cc3-a7b2-f659960214eb\" (UID: \"966fbd45-127e-4cc3-a7b2-f659960214eb\") " Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.102869 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/966fbd45-127e-4cc3-a7b2-f659960214eb-utilities" (OuterVolumeSpecName: "utilities") pod "966fbd45-127e-4cc3-a7b2-f659960214eb" (UID: "966fbd45-127e-4cc3-a7b2-f659960214eb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.108041 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/966fbd45-127e-4cc3-a7b2-f659960214eb-kube-api-access-xhb9s" (OuterVolumeSpecName: "kube-api-access-xhb9s") pod "966fbd45-127e-4cc3-a7b2-f659960214eb" (UID: "966fbd45-127e-4cc3-a7b2-f659960214eb"). InnerVolumeSpecName "kube-api-access-xhb9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.147488 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/966fbd45-127e-4cc3-a7b2-f659960214eb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "966fbd45-127e-4cc3-a7b2-f659960214eb" (UID: "966fbd45-127e-4cc3-a7b2-f659960214eb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.203663 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/966fbd45-127e-4cc3-a7b2-f659960214eb-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.203704 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/966fbd45-127e-4cc3-a7b2-f659960214eb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.203728 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhb9s\" (UniqueName: \"kubernetes.io/projected/966fbd45-127e-4cc3-a7b2-f659960214eb-kube-api-access-xhb9s\") on node \"crc\" DevicePath \"\"" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.599854 4738 generic.go:334] "Generic (PLEG): container finished" podID="966fbd45-127e-4cc3-a7b2-f659960214eb" containerID="2e15b5d60ebf7acdcccd9c94210c13f3a347279ea83c3a809602bcfb42b66317" exitCode=0 Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.599900 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6cz8z" event={"ID":"966fbd45-127e-4cc3-a7b2-f659960214eb","Type":"ContainerDied","Data":"2e15b5d60ebf7acdcccd9c94210c13f3a347279ea83c3a809602bcfb42b66317"} Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.599936 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6cz8z" event={"ID":"966fbd45-127e-4cc3-a7b2-f659960214eb","Type":"ContainerDied","Data":"802c72184925c6d89ddff7e024c8f21bd4a19321de0b335e8b89d9714218b79e"} Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.599946 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6cz8z" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.599954 4738 scope.go:117] "RemoveContainer" containerID="2e15b5d60ebf7acdcccd9c94210c13f3a347279ea83c3a809602bcfb42b66317" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.619998 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6cz8z"] Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.623864 4738 scope.go:117] "RemoveContainer" containerID="42edae66f2e3bad1e447ac6f8cae8d57c461efa9c7430dd2b8df5991c79db3be" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.630575 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6cz8z"] Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.666201 4738 scope.go:117] "RemoveContainer" containerID="ca01ab5354d363d466c857e0f7546c1db6e9c8e5ef635d5bfb4058ca6f7e9dc4" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.684433 4738 scope.go:117] "RemoveContainer" containerID="2e15b5d60ebf7acdcccd9c94210c13f3a347279ea83c3a809602bcfb42b66317" Mar 07 07:47:26 crc kubenswrapper[4738]: E0307 07:47:26.685287 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e15b5d60ebf7acdcccd9c94210c13f3a347279ea83c3a809602bcfb42b66317\": container with ID starting with 2e15b5d60ebf7acdcccd9c94210c13f3a347279ea83c3a809602bcfb42b66317 not found: ID does not exist" containerID="2e15b5d60ebf7acdcccd9c94210c13f3a347279ea83c3a809602bcfb42b66317" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.685332 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e15b5d60ebf7acdcccd9c94210c13f3a347279ea83c3a809602bcfb42b66317"} err="failed to get container status \"2e15b5d60ebf7acdcccd9c94210c13f3a347279ea83c3a809602bcfb42b66317\": rpc error: code = NotFound desc = could not find container 
\"2e15b5d60ebf7acdcccd9c94210c13f3a347279ea83c3a809602bcfb42b66317\": container with ID starting with 2e15b5d60ebf7acdcccd9c94210c13f3a347279ea83c3a809602bcfb42b66317 not found: ID does not exist" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.685358 4738 scope.go:117] "RemoveContainer" containerID="42edae66f2e3bad1e447ac6f8cae8d57c461efa9c7430dd2b8df5991c79db3be" Mar 07 07:47:26 crc kubenswrapper[4738]: E0307 07:47:26.689630 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42edae66f2e3bad1e447ac6f8cae8d57c461efa9c7430dd2b8df5991c79db3be\": container with ID starting with 42edae66f2e3bad1e447ac6f8cae8d57c461efa9c7430dd2b8df5991c79db3be not found: ID does not exist" containerID="42edae66f2e3bad1e447ac6f8cae8d57c461efa9c7430dd2b8df5991c79db3be" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.689678 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42edae66f2e3bad1e447ac6f8cae8d57c461efa9c7430dd2b8df5991c79db3be"} err="failed to get container status \"42edae66f2e3bad1e447ac6f8cae8d57c461efa9c7430dd2b8df5991c79db3be\": rpc error: code = NotFound desc = could not find container \"42edae66f2e3bad1e447ac6f8cae8d57c461efa9c7430dd2b8df5991c79db3be\": container with ID starting with 42edae66f2e3bad1e447ac6f8cae8d57c461efa9c7430dd2b8df5991c79db3be not found: ID does not exist" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.689709 4738 scope.go:117] "RemoveContainer" containerID="ca01ab5354d363d466c857e0f7546c1db6e9c8e5ef635d5bfb4058ca6f7e9dc4" Mar 07 07:47:26 crc kubenswrapper[4738]: E0307 07:47:26.689996 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca01ab5354d363d466c857e0f7546c1db6e9c8e5ef635d5bfb4058ca6f7e9dc4\": container with ID starting with ca01ab5354d363d466c857e0f7546c1db6e9c8e5ef635d5bfb4058ca6f7e9dc4 not found: ID does not exist" 
containerID="ca01ab5354d363d466c857e0f7546c1db6e9c8e5ef635d5bfb4058ca6f7e9dc4" Mar 07 07:47:26 crc kubenswrapper[4738]: I0307 07:47:26.690029 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca01ab5354d363d466c857e0f7546c1db6e9c8e5ef635d5bfb4058ca6f7e9dc4"} err="failed to get container status \"ca01ab5354d363d466c857e0f7546c1db6e9c8e5ef635d5bfb4058ca6f7e9dc4\": rpc error: code = NotFound desc = could not find container \"ca01ab5354d363d466c857e0f7546c1db6e9c8e5ef635d5bfb4058ca6f7e9dc4\": container with ID starting with ca01ab5354d363d466c857e0f7546c1db6e9c8e5ef635d5bfb4058ca6f7e9dc4 not found: ID does not exist" Mar 07 07:47:28 crc kubenswrapper[4738]: I0307 07:47:28.406589 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="966fbd45-127e-4cc3-a7b2-f659960214eb" path="/var/lib/kubelet/pods/966fbd45-127e-4cc3-a7b2-f659960214eb/volumes" Mar 07 07:47:28 crc kubenswrapper[4738]: I0307 07:47:28.882939 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-8qrx4"] Mar 07 07:47:28 crc kubenswrapper[4738]: E0307 07:47:28.886263 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966fbd45-127e-4cc3-a7b2-f659960214eb" containerName="extract-utilities" Mar 07 07:47:28 crc kubenswrapper[4738]: I0307 07:47:28.886301 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="966fbd45-127e-4cc3-a7b2-f659960214eb" containerName="extract-utilities" Mar 07 07:47:28 crc kubenswrapper[4738]: E0307 07:47:28.886324 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="966fbd45-127e-4cc3-a7b2-f659960214eb" containerName="extract-content" Mar 07 07:47:28 crc kubenswrapper[4738]: I0307 07:47:28.886338 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="966fbd45-127e-4cc3-a7b2-f659960214eb" containerName="extract-content" Mar 07 07:47:28 crc kubenswrapper[4738]: E0307 07:47:28.886381 4738 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="966fbd45-127e-4cc3-a7b2-f659960214eb" containerName="registry-server" Mar 07 07:47:28 crc kubenswrapper[4738]: I0307 07:47:28.886395 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="966fbd45-127e-4cc3-a7b2-f659960214eb" containerName="registry-server" Mar 07 07:47:28 crc kubenswrapper[4738]: I0307 07:47:28.886681 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="966fbd45-127e-4cc3-a7b2-f659960214eb" containerName="registry-server" Mar 07 07:47:28 crc kubenswrapper[4738]: I0307 07:47:28.888752 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:28 crc kubenswrapper[4738]: I0307 07:47:28.893256 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8qrx4"] Mar 07 07:47:29 crc kubenswrapper[4738]: I0307 07:47:29.049261 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9caa21ea-b792-46ed-a974-2949f38e692d-catalog-content\") pod \"redhat-operators-8qrx4\" (UID: \"9caa21ea-b792-46ed-a974-2949f38e692d\") " pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:29 crc kubenswrapper[4738]: I0307 07:47:29.049357 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wg7l\" (UniqueName: \"kubernetes.io/projected/9caa21ea-b792-46ed-a974-2949f38e692d-kube-api-access-8wg7l\") pod \"redhat-operators-8qrx4\" (UID: \"9caa21ea-b792-46ed-a974-2949f38e692d\") " pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:29 crc kubenswrapper[4738]: I0307 07:47:29.049407 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9caa21ea-b792-46ed-a974-2949f38e692d-utilities\") pod \"redhat-operators-8qrx4\" (UID: 
\"9caa21ea-b792-46ed-a974-2949f38e692d\") " pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:29 crc kubenswrapper[4738]: I0307 07:47:29.151289 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wg7l\" (UniqueName: \"kubernetes.io/projected/9caa21ea-b792-46ed-a974-2949f38e692d-kube-api-access-8wg7l\") pod \"redhat-operators-8qrx4\" (UID: \"9caa21ea-b792-46ed-a974-2949f38e692d\") " pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:29 crc kubenswrapper[4738]: I0307 07:47:29.151392 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9caa21ea-b792-46ed-a974-2949f38e692d-utilities\") pod \"redhat-operators-8qrx4\" (UID: \"9caa21ea-b792-46ed-a974-2949f38e692d\") " pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:29 crc kubenswrapper[4738]: I0307 07:47:29.151505 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9caa21ea-b792-46ed-a974-2949f38e692d-catalog-content\") pod \"redhat-operators-8qrx4\" (UID: \"9caa21ea-b792-46ed-a974-2949f38e692d\") " pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:29 crc kubenswrapper[4738]: I0307 07:47:29.152004 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9caa21ea-b792-46ed-a974-2949f38e692d-utilities\") pod \"redhat-operators-8qrx4\" (UID: \"9caa21ea-b792-46ed-a974-2949f38e692d\") " pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:29 crc kubenswrapper[4738]: I0307 07:47:29.152178 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9caa21ea-b792-46ed-a974-2949f38e692d-catalog-content\") pod \"redhat-operators-8qrx4\" (UID: \"9caa21ea-b792-46ed-a974-2949f38e692d\") " 
pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:29 crc kubenswrapper[4738]: I0307 07:47:29.184986 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wg7l\" (UniqueName: \"kubernetes.io/projected/9caa21ea-b792-46ed-a974-2949f38e692d-kube-api-access-8wg7l\") pod \"redhat-operators-8qrx4\" (UID: \"9caa21ea-b792-46ed-a974-2949f38e692d\") " pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:29 crc kubenswrapper[4738]: I0307 07:47:29.211685 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:29 crc kubenswrapper[4738]: I0307 07:47:29.667164 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-8qrx4"] Mar 07 07:47:30 crc kubenswrapper[4738]: I0307 07:47:30.638752 4738 generic.go:334] "Generic (PLEG): container finished" podID="9caa21ea-b792-46ed-a974-2949f38e692d" containerID="9a9ad8da9451af2e1eb164ae5644cf9110d8093eff527f96115de533f3fd92f7" exitCode=0 Mar 07 07:47:30 crc kubenswrapper[4738]: I0307 07:47:30.638807 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qrx4" event={"ID":"9caa21ea-b792-46ed-a974-2949f38e692d","Type":"ContainerDied","Data":"9a9ad8da9451af2e1eb164ae5644cf9110d8093eff527f96115de533f3fd92f7"} Mar 07 07:47:30 crc kubenswrapper[4738]: I0307 07:47:30.638977 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qrx4" event={"ID":"9caa21ea-b792-46ed-a974-2949f38e692d","Type":"ContainerStarted","Data":"c34f0f625d30656bfb0a50547d6ae35217a0e371226ba94ca861dd3d79a2bdb2"} Mar 07 07:47:32 crc kubenswrapper[4738]: I0307 07:47:32.659923 4738 generic.go:334] "Generic (PLEG): container finished" podID="9caa21ea-b792-46ed-a974-2949f38e692d" containerID="eb12feb2d9c33436b95c5ee7646d46fde4f02c78c5f389a876ad35cbe9dff48b" exitCode=0 Mar 07 07:47:32 crc kubenswrapper[4738]: 
I0307 07:47:32.660018 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qrx4" event={"ID":"9caa21ea-b792-46ed-a974-2949f38e692d","Type":"ContainerDied","Data":"eb12feb2d9c33436b95c5ee7646d46fde4f02c78c5f389a876ad35cbe9dff48b"} Mar 07 07:47:33 crc kubenswrapper[4738]: I0307 07:47:33.673712 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qrx4" event={"ID":"9caa21ea-b792-46ed-a974-2949f38e692d","Type":"ContainerStarted","Data":"5a719ed30074f25ae0c870d0dc8678bd7095f184c0652d847a1e3fafd9b8a171"} Mar 07 07:47:33 crc kubenswrapper[4738]: I0307 07:47:33.704010 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-8qrx4" podStartSLOduration=3.127760206 podStartE2EDuration="5.703991444s" podCreationTimestamp="2026-03-07 07:47:28 +0000 UTC" firstStartedPulling="2026-03-07 07:47:30.640717778 +0000 UTC m=+2869.105705099" lastFinishedPulling="2026-03-07 07:47:33.216948976 +0000 UTC m=+2871.681936337" observedRunningTime="2026-03-07 07:47:33.700916132 +0000 UTC m=+2872.165903463" watchObservedRunningTime="2026-03-07 07:47:33.703991444 +0000 UTC m=+2872.168978785" Mar 07 07:47:39 crc kubenswrapper[4738]: I0307 07:47:39.212519 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:39 crc kubenswrapper[4738]: I0307 07:47:39.213145 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:40 crc kubenswrapper[4738]: I0307 07:47:40.291809 4738 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-8qrx4" podUID="9caa21ea-b792-46ed-a974-2949f38e692d" containerName="registry-server" probeResult="failure" output=< Mar 07 07:47:40 crc kubenswrapper[4738]: timeout: failed to connect service ":50051" within 1s Mar 07 07:47:40 crc 
kubenswrapper[4738]: > Mar 07 07:47:49 crc kubenswrapper[4738]: I0307 07:47:49.277576 4738 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:49 crc kubenswrapper[4738]: I0307 07:47:49.333204 4738 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:49 crc kubenswrapper[4738]: I0307 07:47:49.517325 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8qrx4"] Mar 07 07:47:50 crc kubenswrapper[4738]: I0307 07:47:50.902361 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-8qrx4" podUID="9caa21ea-b792-46ed-a974-2949f38e692d" containerName="registry-server" containerID="cri-o://5a719ed30074f25ae0c870d0dc8678bd7095f184c0652d847a1e3fafd9b8a171" gracePeriod=2 Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.299348 4738 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.405045 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9caa21ea-b792-46ed-a974-2949f38e692d-catalog-content\") pod \"9caa21ea-b792-46ed-a974-2949f38e692d\" (UID: \"9caa21ea-b792-46ed-a974-2949f38e692d\") " Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.405114 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9caa21ea-b792-46ed-a974-2949f38e692d-utilities\") pod \"9caa21ea-b792-46ed-a974-2949f38e692d\" (UID: \"9caa21ea-b792-46ed-a974-2949f38e692d\") " Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.405268 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wg7l\" (UniqueName: \"kubernetes.io/projected/9caa21ea-b792-46ed-a974-2949f38e692d-kube-api-access-8wg7l\") pod \"9caa21ea-b792-46ed-a974-2949f38e692d\" (UID: \"9caa21ea-b792-46ed-a974-2949f38e692d\") " Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.406191 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9caa21ea-b792-46ed-a974-2949f38e692d-utilities" (OuterVolumeSpecName: "utilities") pod "9caa21ea-b792-46ed-a974-2949f38e692d" (UID: "9caa21ea-b792-46ed-a974-2949f38e692d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.414328 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9caa21ea-b792-46ed-a974-2949f38e692d-kube-api-access-8wg7l" (OuterVolumeSpecName: "kube-api-access-8wg7l") pod "9caa21ea-b792-46ed-a974-2949f38e692d" (UID: "9caa21ea-b792-46ed-a974-2949f38e692d"). InnerVolumeSpecName "kube-api-access-8wg7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.507357 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wg7l\" (UniqueName: \"kubernetes.io/projected/9caa21ea-b792-46ed-a974-2949f38e692d-kube-api-access-8wg7l\") on node \"crc\" DevicePath \"\"" Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.507391 4738 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9caa21ea-b792-46ed-a974-2949f38e692d-utilities\") on node \"crc\" DevicePath \"\"" Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.546727 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9caa21ea-b792-46ed-a974-2949f38e692d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9caa21ea-b792-46ed-a974-2949f38e692d" (UID: "9caa21ea-b792-46ed-a974-2949f38e692d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.609386 4738 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9caa21ea-b792-46ed-a974-2949f38e692d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.912532 4738 generic.go:334] "Generic (PLEG): container finished" podID="9caa21ea-b792-46ed-a974-2949f38e692d" containerID="5a719ed30074f25ae0c870d0dc8678bd7095f184c0652d847a1e3fafd9b8a171" exitCode=0 Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.912583 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-8qrx4" event={"ID":"9caa21ea-b792-46ed-a974-2949f38e692d","Type":"ContainerDied","Data":"5a719ed30074f25ae0c870d0dc8678bd7095f184c0652d847a1e3fafd9b8a171"} Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.912623 4738 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-8qrx4" event={"ID":"9caa21ea-b792-46ed-a974-2949f38e692d","Type":"ContainerDied","Data":"c34f0f625d30656bfb0a50547d6ae35217a0e371226ba94ca861dd3d79a2bdb2"} Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.912643 4738 scope.go:117] "RemoveContainer" containerID="5a719ed30074f25ae0c870d0dc8678bd7095f184c0652d847a1e3fafd9b8a171" Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.912670 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-8qrx4" Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.935491 4738 scope.go:117] "RemoveContainer" containerID="eb12feb2d9c33436b95c5ee7646d46fde4f02c78c5f389a876ad35cbe9dff48b" Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.965469 4738 scope.go:117] "RemoveContainer" containerID="9a9ad8da9451af2e1eb164ae5644cf9110d8093eff527f96115de533f3fd92f7" Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.972089 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-8qrx4"] Mar 07 07:47:51 crc kubenswrapper[4738]: I0307 07:47:51.981632 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-8qrx4"] Mar 07 07:47:52 crc kubenswrapper[4738]: I0307 07:47:52.005197 4738 scope.go:117] "RemoveContainer" containerID="5a719ed30074f25ae0c870d0dc8678bd7095f184c0652d847a1e3fafd9b8a171" Mar 07 07:47:52 crc kubenswrapper[4738]: E0307 07:47:52.005622 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a719ed30074f25ae0c870d0dc8678bd7095f184c0652d847a1e3fafd9b8a171\": container with ID starting with 5a719ed30074f25ae0c870d0dc8678bd7095f184c0652d847a1e3fafd9b8a171 not found: ID does not exist" containerID="5a719ed30074f25ae0c870d0dc8678bd7095f184c0652d847a1e3fafd9b8a171" Mar 07 07:47:52 crc kubenswrapper[4738]: I0307 07:47:52.005665 4738 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a719ed30074f25ae0c870d0dc8678bd7095f184c0652d847a1e3fafd9b8a171"} err="failed to get container status \"5a719ed30074f25ae0c870d0dc8678bd7095f184c0652d847a1e3fafd9b8a171\": rpc error: code = NotFound desc = could not find container \"5a719ed30074f25ae0c870d0dc8678bd7095f184c0652d847a1e3fafd9b8a171\": container with ID starting with 5a719ed30074f25ae0c870d0dc8678bd7095f184c0652d847a1e3fafd9b8a171 not found: ID does not exist" Mar 07 07:47:52 crc kubenswrapper[4738]: I0307 07:47:52.005692 4738 scope.go:117] "RemoveContainer" containerID="eb12feb2d9c33436b95c5ee7646d46fde4f02c78c5f389a876ad35cbe9dff48b" Mar 07 07:47:52 crc kubenswrapper[4738]: E0307 07:47:52.006052 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb12feb2d9c33436b95c5ee7646d46fde4f02c78c5f389a876ad35cbe9dff48b\": container with ID starting with eb12feb2d9c33436b95c5ee7646d46fde4f02c78c5f389a876ad35cbe9dff48b not found: ID does not exist" containerID="eb12feb2d9c33436b95c5ee7646d46fde4f02c78c5f389a876ad35cbe9dff48b" Mar 07 07:47:52 crc kubenswrapper[4738]: I0307 07:47:52.006106 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb12feb2d9c33436b95c5ee7646d46fde4f02c78c5f389a876ad35cbe9dff48b"} err="failed to get container status \"eb12feb2d9c33436b95c5ee7646d46fde4f02c78c5f389a876ad35cbe9dff48b\": rpc error: code = NotFound desc = could not find container \"eb12feb2d9c33436b95c5ee7646d46fde4f02c78c5f389a876ad35cbe9dff48b\": container with ID starting with eb12feb2d9c33436b95c5ee7646d46fde4f02c78c5f389a876ad35cbe9dff48b not found: ID does not exist" Mar 07 07:47:52 crc kubenswrapper[4738]: I0307 07:47:52.006139 4738 scope.go:117] "RemoveContainer" containerID="9a9ad8da9451af2e1eb164ae5644cf9110d8093eff527f96115de533f3fd92f7" Mar 07 07:47:52 crc kubenswrapper[4738]: E0307 
07:47:52.006541 4738 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a9ad8da9451af2e1eb164ae5644cf9110d8093eff527f96115de533f3fd92f7\": container with ID starting with 9a9ad8da9451af2e1eb164ae5644cf9110d8093eff527f96115de533f3fd92f7 not found: ID does not exist" containerID="9a9ad8da9451af2e1eb164ae5644cf9110d8093eff527f96115de533f3fd92f7" Mar 07 07:47:52 crc kubenswrapper[4738]: I0307 07:47:52.006607 4738 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a9ad8da9451af2e1eb164ae5644cf9110d8093eff527f96115de533f3fd92f7"} err="failed to get container status \"9a9ad8da9451af2e1eb164ae5644cf9110d8093eff527f96115de533f3fd92f7\": rpc error: code = NotFound desc = could not find container \"9a9ad8da9451af2e1eb164ae5644cf9110d8093eff527f96115de533f3fd92f7\": container with ID starting with 9a9ad8da9451af2e1eb164ae5644cf9110d8093eff527f96115de533f3fd92f7 not found: ID does not exist" Mar 07 07:47:52 crc kubenswrapper[4738]: I0307 07:47:52.401058 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9caa21ea-b792-46ed-a974-2949f38e692d" path="/var/lib/kubelet/pods/9caa21ea-b792-46ed-a974-2949f38e692d/volumes" Mar 07 07:47:54 crc kubenswrapper[4738]: I0307 07:47:54.617708 4738 scope.go:117] "RemoveContainer" containerID="f34ad99a874273df7f08265b145c56f9ef18864f84eccea2b3f72bbe7a780f4d" Mar 07 07:47:54 crc kubenswrapper[4738]: I0307 07:47:54.650090 4738 scope.go:117] "RemoveContainer" containerID="5b4d45e959be0a7c1b5187901c7243a2d5e1c9b82bcb5b528b971370af455bae" Mar 07 07:47:54 crc kubenswrapper[4738]: I0307 07:47:54.675051 4738 scope.go:117] "RemoveContainer" containerID="82670cda4a5da5bd69a732e07039acbd39db98f89ace22d01c5252fb0885d12d" Mar 07 07:47:54 crc kubenswrapper[4738]: I0307 07:47:54.701320 4738 scope.go:117] "RemoveContainer" containerID="93997d9d5fc597655aaee18b2640ecc848ecc8938fc2417e886114a31e1f9b73" Mar 07 07:47:54 crc 
kubenswrapper[4738]: I0307 07:47:54.733509 4738 scope.go:117] "RemoveContainer" containerID="0605d2b50860efebc197652f404ef9b74dbb5d41b438c852bb1d892ad668ce65" Mar 07 07:47:54 crc kubenswrapper[4738]: I0307 07:47:54.763332 4738 scope.go:117] "RemoveContainer" containerID="8e099504f7ca0eaf8d64b32848961c4b8c74ca54eed1f418d78ef2964fab477f" Mar 07 07:47:54 crc kubenswrapper[4738]: I0307 07:47:54.794049 4738 scope.go:117] "RemoveContainer" containerID="0ad4e62b9900bd9605c15226b29c3f46304ff37e1a122f233e2bda5cdb38646b" Mar 07 07:47:54 crc kubenswrapper[4738]: I0307 07:47:54.816142 4738 scope.go:117] "RemoveContainer" containerID="cec6a0ef6754c78b76ee26c02e2903841b1c101f43bd2ea7e217a3f5e5f7561b" Mar 07 07:48:00 crc kubenswrapper[4738]: I0307 07:48:00.153995 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547828-75nt4"] Mar 07 07:48:00 crc kubenswrapper[4738]: E0307 07:48:00.154727 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9caa21ea-b792-46ed-a974-2949f38e692d" containerName="registry-server" Mar 07 07:48:00 crc kubenswrapper[4738]: I0307 07:48:00.154738 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="9caa21ea-b792-46ed-a974-2949f38e692d" containerName="registry-server" Mar 07 07:48:00 crc kubenswrapper[4738]: E0307 07:48:00.154764 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9caa21ea-b792-46ed-a974-2949f38e692d" containerName="extract-content" Mar 07 07:48:00 crc kubenswrapper[4738]: I0307 07:48:00.154769 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="9caa21ea-b792-46ed-a974-2949f38e692d" containerName="extract-content" Mar 07 07:48:00 crc kubenswrapper[4738]: E0307 07:48:00.154783 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9caa21ea-b792-46ed-a974-2949f38e692d" containerName="extract-utilities" Mar 07 07:48:00 crc kubenswrapper[4738]: I0307 07:48:00.154790 4738 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9caa21ea-b792-46ed-a974-2949f38e692d" containerName="extract-utilities"
Mar 07 07:48:00 crc kubenswrapper[4738]: I0307 07:48:00.154917 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="9caa21ea-b792-46ed-a974-2949f38e692d" containerName="registry-server"
Mar 07 07:48:00 crc kubenswrapper[4738]: I0307 07:48:00.155413 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547828-75nt4"
Mar 07 07:48:00 crc kubenswrapper[4738]: I0307 07:48:00.158098 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 07 07:48:00 crc kubenswrapper[4738]: I0307 07:48:00.158698 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 07 07:48:00 crc kubenswrapper[4738]: I0307 07:48:00.158808 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5"
Mar 07 07:48:00 crc kubenswrapper[4738]: I0307 07:48:00.167414 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547828-75nt4"]
Mar 07 07:48:00 crc kubenswrapper[4738]: I0307 07:48:00.349052 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pznx6\" (UniqueName: \"kubernetes.io/projected/4abc42f2-4ec8-4614-b4d6-951f435f014b-kube-api-access-pznx6\") pod \"auto-csr-approver-29547828-75nt4\" (UID: \"4abc42f2-4ec8-4614-b4d6-951f435f014b\") " pod="openshift-infra/auto-csr-approver-29547828-75nt4"
Mar 07 07:48:00 crc kubenswrapper[4738]: I0307 07:48:00.451129 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pznx6\" (UniqueName: \"kubernetes.io/projected/4abc42f2-4ec8-4614-b4d6-951f435f014b-kube-api-access-pznx6\") pod \"auto-csr-approver-29547828-75nt4\" (UID: \"4abc42f2-4ec8-4614-b4d6-951f435f014b\") " pod="openshift-infra/auto-csr-approver-29547828-75nt4"
Mar 07 07:48:00 crc kubenswrapper[4738]: I0307 07:48:00.494947 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pznx6\" (UniqueName: \"kubernetes.io/projected/4abc42f2-4ec8-4614-b4d6-951f435f014b-kube-api-access-pznx6\") pod \"auto-csr-approver-29547828-75nt4\" (UID: \"4abc42f2-4ec8-4614-b4d6-951f435f014b\") " pod="openshift-infra/auto-csr-approver-29547828-75nt4"
Mar 07 07:48:00 crc kubenswrapper[4738]: I0307 07:48:00.777758 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547828-75nt4"
Mar 07 07:48:01 crc kubenswrapper[4738]: I0307 07:48:01.250293 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547828-75nt4"]
Mar 07 07:48:01 crc kubenswrapper[4738]: I0307 07:48:01.999507 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547828-75nt4" event={"ID":"4abc42f2-4ec8-4614-b4d6-951f435f014b","Type":"ContainerStarted","Data":"e87835b948cdd995689b964652b8ab428b3fa47a4150413875de715117fa9cde"}
Mar 07 07:48:03 crc kubenswrapper[4738]: I0307 07:48:03.013640 4738 generic.go:334] "Generic (PLEG): container finished" podID="4abc42f2-4ec8-4614-b4d6-951f435f014b" containerID="bcf9ef7eb9c6d1d2e6e86cd460bd8744ac793d3e085ed66d5b8b40253141b9cb" exitCode=0
Mar 07 07:48:03 crc kubenswrapper[4738]: I0307 07:48:03.013698 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547828-75nt4" event={"ID":"4abc42f2-4ec8-4614-b4d6-951f435f014b","Type":"ContainerDied","Data":"bcf9ef7eb9c6d1d2e6e86cd460bd8744ac793d3e085ed66d5b8b40253141b9cb"}
Mar 07 07:48:04 crc kubenswrapper[4738]: I0307 07:48:04.291129 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547828-75nt4"
Mar 07 07:48:04 crc kubenswrapper[4738]: I0307 07:48:04.407477 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pznx6\" (UniqueName: \"kubernetes.io/projected/4abc42f2-4ec8-4614-b4d6-951f435f014b-kube-api-access-pznx6\") pod \"4abc42f2-4ec8-4614-b4d6-951f435f014b\" (UID: \"4abc42f2-4ec8-4614-b4d6-951f435f014b\") "
Mar 07 07:48:04 crc kubenswrapper[4738]: I0307 07:48:04.420273 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4abc42f2-4ec8-4614-b4d6-951f435f014b-kube-api-access-pznx6" (OuterVolumeSpecName: "kube-api-access-pznx6") pod "4abc42f2-4ec8-4614-b4d6-951f435f014b" (UID: "4abc42f2-4ec8-4614-b4d6-951f435f014b"). InnerVolumeSpecName "kube-api-access-pznx6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:48:04 crc kubenswrapper[4738]: I0307 07:48:04.509390 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pznx6\" (UniqueName: \"kubernetes.io/projected/4abc42f2-4ec8-4614-b4d6-951f435f014b-kube-api-access-pznx6\") on node \"crc\" DevicePath \"\""
Mar 07 07:48:05 crc kubenswrapper[4738]: I0307 07:48:05.031805 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547828-75nt4" event={"ID":"4abc42f2-4ec8-4614-b4d6-951f435f014b","Type":"ContainerDied","Data":"e87835b948cdd995689b964652b8ab428b3fa47a4150413875de715117fa9cde"}
Mar 07 07:48:05 crc kubenswrapper[4738]: I0307 07:48:05.032129 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e87835b948cdd995689b964652b8ab428b3fa47a4150413875de715117fa9cde"
Mar 07 07:48:05 crc kubenswrapper[4738]: I0307 07:48:05.031851 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547828-75nt4"
Mar 07 07:48:05 crc kubenswrapper[4738]: I0307 07:48:05.362771 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547822-j4n2s"]
Mar 07 07:48:05 crc kubenswrapper[4738]: I0307 07:48:05.370106 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547822-j4n2s"]
Mar 07 07:48:06 crc kubenswrapper[4738]: I0307 07:48:06.394487 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a15887fc-8d37-4d3f-a195-b17ecf256717" path="/var/lib/kubelet/pods/a15887fc-8d37-4d3f-a195-b17ecf256717/volumes"
Mar 07 07:48:54 crc kubenswrapper[4738]: I0307 07:48:54.976614 4738 scope.go:117] "RemoveContainer" containerID="dde2cbe63459b0a3d58442e5ac0df81a71cbf68b5f716d45aee18730e73f6a13"
Mar 07 07:48:55 crc kubenswrapper[4738]: I0307 07:48:55.029117 4738 scope.go:117] "RemoveContainer" containerID="9ff7a0f733008bdfeefa8c03f9b720ab32f582e7b8261823a50efabe1e701523"
Mar 07 07:48:55 crc kubenswrapper[4738]: I0307 07:48:55.109247 4738 scope.go:117] "RemoveContainer" containerID="b2895047cf261d9a053be3204ca0c5595b0e12379a20c28bb71c36024f09edb5"
Mar 07 07:48:55 crc kubenswrapper[4738]: I0307 07:48:55.138216 4738 scope.go:117] "RemoveContainer" containerID="abdc557f8c4d57a924c279189a305c6e56fd31c68cad7d4b5d232ee4d344d17b"
Mar 07 07:48:56 crc kubenswrapper[4738]: I0307 07:48:56.958425 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 07:48:56 crc kubenswrapper[4738]: I0307 07:48:56.958765 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 07:49:26 crc kubenswrapper[4738]: I0307 07:49:26.957394 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 07:49:26 crc kubenswrapper[4738]: I0307 07:49:26.958052 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 07:49:56 crc kubenswrapper[4738]: I0307 07:49:56.957234 4738 patch_prober.go:28] interesting pod/machine-config-daemon-t7vcc container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 07:49:56 crc kubenswrapper[4738]: I0307 07:49:56.957764 4738 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 07:49:56 crc kubenswrapper[4738]: I0307 07:49:56.957803 4738 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc"
Mar 07 07:49:56 crc kubenswrapper[4738]: I0307 07:49:56.958385 4738 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8b6cb246b831f526704e981293eee2a64b63091b8dbaca6aa0bf12d927dad461"} pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 07 07:49:56 crc kubenswrapper[4738]: I0307 07:49:56.958438 4738 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" podUID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerName="machine-config-daemon" containerID="cri-o://8b6cb246b831f526704e981293eee2a64b63091b8dbaca6aa0bf12d927dad461" gracePeriod=600
Mar 07 07:49:58 crc kubenswrapper[4738]: I0307 07:49:58.064529 4738 generic.go:334] "Generic (PLEG): container finished" podID="0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7" containerID="8b6cb246b831f526704e981293eee2a64b63091b8dbaca6aa0bf12d927dad461" exitCode=0
Mar 07 07:49:58 crc kubenswrapper[4738]: I0307 07:49:58.064588 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerDied","Data":"8b6cb246b831f526704e981293eee2a64b63091b8dbaca6aa0bf12d927dad461"}
Mar 07 07:49:58 crc kubenswrapper[4738]: I0307 07:49:58.065786 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-t7vcc" event={"ID":"0ef6567d-1c0f-4e07-9c6d-de1166e8b8a7","Type":"ContainerStarted","Data":"5a419ad4dc022769ca0c59f48b816ffad293a2e45b8d38c4f25b896d77ca094c"}
Mar 07 07:49:58 crc kubenswrapper[4738]: I0307 07:49:58.065957 4738 scope.go:117] "RemoveContainer" containerID="549cae6e92de35978bdcdfdcb7cfc6ed61538299530f0afa8502902cd0a3a7ee"
Mar 07 07:50:00 crc kubenswrapper[4738]: I0307 07:50:00.161276 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29547830-6nfhd"]
Mar 07 07:50:00 crc kubenswrapper[4738]: E0307 07:50:00.161913 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4abc42f2-4ec8-4614-b4d6-951f435f014b" containerName="oc"
Mar 07 07:50:00 crc kubenswrapper[4738]: I0307 07:50:00.161927 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="4abc42f2-4ec8-4614-b4d6-951f435f014b" containerName="oc"
Mar 07 07:50:00 crc kubenswrapper[4738]: I0307 07:50:00.162146 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="4abc42f2-4ec8-4614-b4d6-951f435f014b" containerName="oc"
Mar 07 07:50:00 crc kubenswrapper[4738]: I0307 07:50:00.162828 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547830-6nfhd"
Mar 07 07:50:00 crc kubenswrapper[4738]: I0307 07:50:00.164844 4738 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rpsp5"
Mar 07 07:50:00 crc kubenswrapper[4738]: I0307 07:50:00.165059 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 07 07:50:00 crc kubenswrapper[4738]: I0307 07:50:00.165123 4738 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 07 07:50:00 crc kubenswrapper[4738]: I0307 07:50:00.189148 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547830-6nfhd"]
Mar 07 07:50:00 crc kubenswrapper[4738]: I0307 07:50:00.326574 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9rxf\" (UniqueName: \"kubernetes.io/projected/2214e976-6c5f-49d2-952d-5e006df06b95-kube-api-access-h9rxf\") pod \"auto-csr-approver-29547830-6nfhd\" (UID: \"2214e976-6c5f-49d2-952d-5e006df06b95\") " pod="openshift-infra/auto-csr-approver-29547830-6nfhd"
Mar 07 07:50:00 crc kubenswrapper[4738]: I0307 07:50:00.428589 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9rxf\" (UniqueName: \"kubernetes.io/projected/2214e976-6c5f-49d2-952d-5e006df06b95-kube-api-access-h9rxf\") pod \"auto-csr-approver-29547830-6nfhd\" (UID: \"2214e976-6c5f-49d2-952d-5e006df06b95\") " pod="openshift-infra/auto-csr-approver-29547830-6nfhd"
Mar 07 07:50:00 crc kubenswrapper[4738]: I0307 07:50:00.449529 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9rxf\" (UniqueName: \"kubernetes.io/projected/2214e976-6c5f-49d2-952d-5e006df06b95-kube-api-access-h9rxf\") pod \"auto-csr-approver-29547830-6nfhd\" (UID: \"2214e976-6c5f-49d2-952d-5e006df06b95\") " pod="openshift-infra/auto-csr-approver-29547830-6nfhd"
Mar 07 07:50:00 crc kubenswrapper[4738]: I0307 07:50:00.488998 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547830-6nfhd"
Mar 07 07:50:00 crc kubenswrapper[4738]: I0307 07:50:00.782651 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29547830-6nfhd"]
Mar 07 07:50:00 crc kubenswrapper[4738]: W0307 07:50:00.786479 4738 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2214e976_6c5f_49d2_952d_5e006df06b95.slice/crio-5912cdd8265ffa57cedfd32bfe85d18681f6fb217ddd3820dd5dbbe5e2b6e03b WatchSource:0}: Error finding container 5912cdd8265ffa57cedfd32bfe85d18681f6fb217ddd3820dd5dbbe5e2b6e03b: Status 404 returned error can't find the container with id 5912cdd8265ffa57cedfd32bfe85d18681f6fb217ddd3820dd5dbbe5e2b6e03b
Mar 07 07:50:01 crc kubenswrapper[4738]: I0307 07:50:01.113393 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547830-6nfhd" event={"ID":"2214e976-6c5f-49d2-952d-5e006df06b95","Type":"ContainerStarted","Data":"5912cdd8265ffa57cedfd32bfe85d18681f6fb217ddd3820dd5dbbe5e2b6e03b"}
Mar 07 07:50:02 crc kubenswrapper[4738]: I0307 07:50:02.124663 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547830-6nfhd" event={"ID":"2214e976-6c5f-49d2-952d-5e006df06b95","Type":"ContainerStarted","Data":"5aca257b4afad030e553962422de7d1066f62ae1454036af58b2002685931fa4"}
Mar 07 07:50:02 crc kubenswrapper[4738]: I0307 07:50:02.157928 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29547830-6nfhd" podStartSLOduration=1.39491027 podStartE2EDuration="2.157904143s" podCreationTimestamp="2026-03-07 07:50:00 +0000 UTC" firstStartedPulling="2026-03-07 07:50:00.796781636 +0000 UTC m=+3019.261768967" lastFinishedPulling="2026-03-07 07:50:01.559775519 +0000 UTC m=+3020.024762840" observedRunningTime="2026-03-07 07:50:02.143093913 +0000 UTC m=+3020.608081274" watchObservedRunningTime="2026-03-07 07:50:02.157904143 +0000 UTC m=+3020.622891504"
Mar 07 07:50:03 crc kubenswrapper[4738]: I0307 07:50:03.131917 4738 generic.go:334] "Generic (PLEG): container finished" podID="2214e976-6c5f-49d2-952d-5e006df06b95" containerID="5aca257b4afad030e553962422de7d1066f62ae1454036af58b2002685931fa4" exitCode=0
Mar 07 07:50:03 crc kubenswrapper[4738]: I0307 07:50:03.131959 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547830-6nfhd" event={"ID":"2214e976-6c5f-49d2-952d-5e006df06b95","Type":"ContainerDied","Data":"5aca257b4afad030e553962422de7d1066f62ae1454036af58b2002685931fa4"}
Mar 07 07:50:04 crc kubenswrapper[4738]: I0307 07:50:04.479669 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547830-6nfhd"
Mar 07 07:50:04 crc kubenswrapper[4738]: I0307 07:50:04.615683 4738 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9rxf\" (UniqueName: \"kubernetes.io/projected/2214e976-6c5f-49d2-952d-5e006df06b95-kube-api-access-h9rxf\") pod \"2214e976-6c5f-49d2-952d-5e006df06b95\" (UID: \"2214e976-6c5f-49d2-952d-5e006df06b95\") "
Mar 07 07:50:04 crc kubenswrapper[4738]: I0307 07:50:04.621508 4738 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2214e976-6c5f-49d2-952d-5e006df06b95-kube-api-access-h9rxf" (OuterVolumeSpecName: "kube-api-access-h9rxf") pod "2214e976-6c5f-49d2-952d-5e006df06b95" (UID: "2214e976-6c5f-49d2-952d-5e006df06b95"). InnerVolumeSpecName "kube-api-access-h9rxf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 07:50:04 crc kubenswrapper[4738]: I0307 07:50:04.717408 4738 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9rxf\" (UniqueName: \"kubernetes.io/projected/2214e976-6c5f-49d2-952d-5e006df06b95-kube-api-access-h9rxf\") on node \"crc\" DevicePath \"\""
Mar 07 07:50:05 crc kubenswrapper[4738]: I0307 07:50:05.152575 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29547830-6nfhd" event={"ID":"2214e976-6c5f-49d2-952d-5e006df06b95","Type":"ContainerDied","Data":"5912cdd8265ffa57cedfd32bfe85d18681f6fb217ddd3820dd5dbbe5e2b6e03b"}
Mar 07 07:50:05 crc kubenswrapper[4738]: I0307 07:50:05.152660 4738 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5912cdd8265ffa57cedfd32bfe85d18681f6fb217ddd3820dd5dbbe5e2b6e03b"
Mar 07 07:50:05 crc kubenswrapper[4738]: I0307 07:50:05.152702 4738 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29547830-6nfhd"
Mar 07 07:50:05 crc kubenswrapper[4738]: I0307 07:50:05.549092 4738 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29547824-kkjb7"]
Mar 07 07:50:05 crc kubenswrapper[4738]: I0307 07:50:05.558176 4738 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29547824-kkjb7"]
Mar 07 07:50:06 crc kubenswrapper[4738]: I0307 07:50:06.398625 4738 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa" path="/var/lib/kubelet/pods/22b7b9e4-201f-4958-8fd1-2a2f3c05fbaa/volumes"
Mar 07 07:50:55 crc kubenswrapper[4738]: I0307 07:50:55.238347 4738 scope.go:117] "RemoveContainer" containerID="f4800bb489ba2fd073e2e345b84b82822f470703d65ee021831a67a03d6914e5"
Mar 07 07:51:08 crc kubenswrapper[4738]: I0307 07:51:08.751947 4738 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x6scf"]
Mar 07 07:51:08 crc kubenswrapper[4738]: E0307 07:51:08.753117 4738 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2214e976-6c5f-49d2-952d-5e006df06b95" containerName="oc"
Mar 07 07:51:08 crc kubenswrapper[4738]: I0307 07:51:08.753134 4738 state_mem.go:107] "Deleted CPUSet assignment" podUID="2214e976-6c5f-49d2-952d-5e006df06b95" containerName="oc"
Mar 07 07:51:08 crc kubenswrapper[4738]: I0307 07:51:08.753422 4738 memory_manager.go:354] "RemoveStaleState removing state" podUID="2214e976-6c5f-49d2-952d-5e006df06b95" containerName="oc"
Mar 07 07:51:08 crc kubenswrapper[4738]: I0307 07:51:08.754705 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x6scf"
Mar 07 07:51:08 crc kubenswrapper[4738]: I0307 07:51:08.763072 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x6scf"]
Mar 07 07:51:08 crc kubenswrapper[4738]: I0307 07:51:08.768096 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c35816-9ba6-4c18-8b71-2998e70edb67-utilities\") pod \"community-operators-x6scf\" (UID: \"29c35816-9ba6-4c18-8b71-2998e70edb67\") " pod="openshift-marketplace/community-operators-x6scf"
Mar 07 07:51:08 crc kubenswrapper[4738]: I0307 07:51:08.768148 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbd9p\" (UniqueName: \"kubernetes.io/projected/29c35816-9ba6-4c18-8b71-2998e70edb67-kube-api-access-sbd9p\") pod \"community-operators-x6scf\" (UID: \"29c35816-9ba6-4c18-8b71-2998e70edb67\") " pod="openshift-marketplace/community-operators-x6scf"
Mar 07 07:51:08 crc kubenswrapper[4738]: I0307 07:51:08.768343 4738 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c35816-9ba6-4c18-8b71-2998e70edb67-catalog-content\") pod \"community-operators-x6scf\" (UID: \"29c35816-9ba6-4c18-8b71-2998e70edb67\") " pod="openshift-marketplace/community-operators-x6scf"
Mar 07 07:51:08 crc kubenswrapper[4738]: I0307 07:51:08.870075 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c35816-9ba6-4c18-8b71-2998e70edb67-utilities\") pod \"community-operators-x6scf\" (UID: \"29c35816-9ba6-4c18-8b71-2998e70edb67\") " pod="openshift-marketplace/community-operators-x6scf"
Mar 07 07:51:08 crc kubenswrapper[4738]: I0307 07:51:08.870123 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbd9p\" (UniqueName: \"kubernetes.io/projected/29c35816-9ba6-4c18-8b71-2998e70edb67-kube-api-access-sbd9p\") pod \"community-operators-x6scf\" (UID: \"29c35816-9ba6-4c18-8b71-2998e70edb67\") " pod="openshift-marketplace/community-operators-x6scf"
Mar 07 07:51:08 crc kubenswrapper[4738]: I0307 07:51:08.870197 4738 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c35816-9ba6-4c18-8b71-2998e70edb67-catalog-content\") pod \"community-operators-x6scf\" (UID: \"29c35816-9ba6-4c18-8b71-2998e70edb67\") " pod="openshift-marketplace/community-operators-x6scf"
Mar 07 07:51:08 crc kubenswrapper[4738]: I0307 07:51:08.870619 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29c35816-9ba6-4c18-8b71-2998e70edb67-utilities\") pod \"community-operators-x6scf\" (UID: \"29c35816-9ba6-4c18-8b71-2998e70edb67\") " pod="openshift-marketplace/community-operators-x6scf"
Mar 07 07:51:08 crc kubenswrapper[4738]: I0307 07:51:08.870675 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29c35816-9ba6-4c18-8b71-2998e70edb67-catalog-content\") pod \"community-operators-x6scf\" (UID: \"29c35816-9ba6-4c18-8b71-2998e70edb67\") " pod="openshift-marketplace/community-operators-x6scf"
Mar 07 07:51:08 crc kubenswrapper[4738]: I0307 07:51:08.895554 4738 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbd9p\" (UniqueName: \"kubernetes.io/projected/29c35816-9ba6-4c18-8b71-2998e70edb67-kube-api-access-sbd9p\") pod \"community-operators-x6scf\" (UID: \"29c35816-9ba6-4c18-8b71-2998e70edb67\") " pod="openshift-marketplace/community-operators-x6scf"
Mar 07 07:51:09 crc kubenswrapper[4738]: I0307 07:51:09.083378 4738 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x6scf"
Mar 07 07:51:09 crc kubenswrapper[4738]: I0307 07:51:09.603863 4738 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x6scf"]
Mar 07 07:51:09 crc kubenswrapper[4738]: I0307 07:51:09.745659 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6scf" event={"ID":"29c35816-9ba6-4c18-8b71-2998e70edb67","Type":"ContainerStarted","Data":"27e2027833690e42759ace1179fb12e478c20caab4470aaee649678645d18bd2"}
Mar 07 07:51:10 crc kubenswrapper[4738]: I0307 07:51:10.755593 4738 generic.go:334] "Generic (PLEG): container finished" podID="29c35816-9ba6-4c18-8b71-2998e70edb67" containerID="a4fe6134e363947e2a369a93ffa4727162a6a4fc7abb22e36ef9ec690b41d839" exitCode=0
Mar 07 07:51:10 crc kubenswrapper[4738]: I0307 07:51:10.755870 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6scf" event={"ID":"29c35816-9ba6-4c18-8b71-2998e70edb67","Type":"ContainerDied","Data":"a4fe6134e363947e2a369a93ffa4727162a6a4fc7abb22e36ef9ec690b41d839"}
Mar 07 07:51:10 crc kubenswrapper[4738]: I0307 07:51:10.758613 4738 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 07 07:51:11 crc kubenswrapper[4738]: I0307 07:51:11.766692 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6scf" event={"ID":"29c35816-9ba6-4c18-8b71-2998e70edb67","Type":"ContainerStarted","Data":"e6192c583153a70ba59676354d01dc4fcacebfabcfb09cb91d21537a6aa18a10"}
Mar 07 07:51:12 crc kubenswrapper[4738]: I0307 07:51:12.804180 4738 generic.go:334] "Generic (PLEG): container finished" podID="29c35816-9ba6-4c18-8b71-2998e70edb67" containerID="e6192c583153a70ba59676354d01dc4fcacebfabcfb09cb91d21537a6aa18a10" exitCode=0
Mar 07 07:51:12 crc kubenswrapper[4738]: I0307 07:51:12.804334 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6scf" event={"ID":"29c35816-9ba6-4c18-8b71-2998e70edb67","Type":"ContainerDied","Data":"e6192c583153a70ba59676354d01dc4fcacebfabcfb09cb91d21537a6aa18a10"}
Mar 07 07:51:13 crc kubenswrapper[4738]: I0307 07:51:13.815449 4738 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x6scf" event={"ID":"29c35816-9ba6-4c18-8b71-2998e70edb67","Type":"ContainerStarted","Data":"276fba2f9d75001f45525e76f33c81cb77dafd5caec03541f18d1d195335eb8c"}
Mar 07 07:51:13 crc kubenswrapper[4738]: I0307 07:51:13.842095 4738 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x6scf" podStartSLOduration=3.364291692 podStartE2EDuration="5.842076568s" podCreationTimestamp="2026-03-07 07:51:08 +0000 UTC" firstStartedPulling="2026-03-07 07:51:10.758214583 +0000 UTC m=+3089.223201944" lastFinishedPulling="2026-03-07 07:51:13.235999479 +0000 UTC m=+3091.700986820" observedRunningTime="2026-03-07 07:51:13.838572283 +0000 UTC m=+3092.303559604" watchObservedRunningTime="2026-03-07 07:51:13.842076568 +0000 UTC m=+3092.307063899"